EValuation of Interactive VisuAl Machine Learning systems

The goal of the EVIVA-ML workshop is to bring together visualization researchers and practitioners to discuss experiences and viewpoints on how to effectively evaluate interactive visual machine learning systems.

Workshop Details

Goal

Recent advances in machine learning have seen the rise of powerful automatic methods for building robust predictive models from data. To enhance understanding and improve performance, human-centred approaches have been pursued. In interactive visual machine learning (IVML) systems such as [1,2,3,4,5], a human operator and a machine collaborate to achieve a task, mediated by an interactive visual interface. Typically, an IVML system comprises an automated service, a user interface, and a learning component. In IVML, the role of the human operator is not only to interpret and understand the underlying models or decisions, but also to actively act on, and react to, these models. This raises not only the known problems of intelligibility, trust, and usability, but also many open questions regarding the evaluation of the various facets of an IVML system, both as separate components and as a holistic entity that combines human and machine intelligence. Identifying the best evaluation methods for validating machine learning (ML) and IVML models remains a challenging topic.

The goal of the EVIVA-ML workshop is to bring together visualization researchers and practitioners to discuss experiences and viewpoints on how to effectively evaluate interactive visual machine learning systems. Ultimately, the workshop aims to develop a research agenda for IVML evaluation.

  1. S. Amershi, J. Fogarty, and D. Weld. ReGroup: Interactive machine learning for on-demand group creation in social networks. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2012.
  2. E. T. Brown, J. Liu, C. E. Brodley, and R. Chang. Dis-function: Learning distance functions interactively. In IEEE Conference on Visual Analytics Science and Technology (VAST). IEEE, 2012.
  3. W. Cancino, N. Boukhelifa, and E. Lutton. EvoGraphDice: Interactive evolution for visual analytics. In IEEE Congress on Evolutionary Computation (CEC). IEEE, 2012.
  4. M. El-Assady, R. Sevastjanova, F. Sperrle, D. Keim, and C. Collins. Progressive learning of topic modeling parameters: A visual analytics framework. IEEE Transactions on Visualization and Computer Graphics, 2018.
  5. H. Kim, J. Choo, H. Park, and A. Endert. InterAxis: Steering scatterplot axes via observation-level interaction. IEEE Transactions on Visualization and Computer Graphics, 2016.


Topics

The workshop aims to foster discussion on topics related to the evaluation of interactive visual machine learning systems, including but not limited to:

  • User studies of existing or novel IVML systems
  • Computational and automatic evaluation of IVML systems
  • Comparative studies of variants of IVML systems
  • Case studies to evaluate the usability of IVML systems
  • Surveys on evaluation methods for IVML
  • Novel evaluation methods for IVML
  • Applications of existing evaluation methods for IVML
  • Heuristic and other low-cost approaches for evaluating IVML
  • Evaluation metrics for IVML (e.g., integrating model and user metrics)
  • Taxonomies of tasks for IVML
  • Lessons learnt and reflections on evaluation methods for IVML


Submissions

We invite short paper submissions (research or position papers) of two to four pages (excluding references). Submissions will be reviewed by the organizing committee and selected external reviewers, and will be chosen according to relevance, quality, and likelihood of stimulating and contributing to the discussion.

Submissions must be formatted according to the VGTC conference style template.

Papers are to be submitted online through the Precision Conference System (link TBA).

Accepted contributions will be made available electronically as a collection of preprints. Authors will retain copyright.


Important Dates

  • Submission deadline: July 1, 2019
  • Author notification: August 5, 2019
  • Camera-ready deadline: August 20, 2019
  • Workshop: October 2019


Organizing Committee

 

Advisory Committee