A Crowdsourcing System for Integrated and Reproducible Evaluation in Scientific Visualization

Rickard Englund (Linköping University), Sathish Kottravel (Linköping University), Timo Ropinski (Ulm University)

IEEE Pacific Visualization Symposium, 2016

Abstract

User evaluations have gained increasing importance in visualization research in recent years, as in many cases these evaluations are the only way to support the claims made by visualization researchers. Unfortunately, recent literature reviews show that, in comparison to algorithmic performance evaluations, the number of user evaluations is still very low. Reasons for this include the time required to conduct such studies, together with the difficulties of participant recruitment and result reporting. While it has been shown that the quality of evaluation results and the simplified participant recruitment of crowdsourcing platforms make this technology a viable alternative to lab experiments for evaluating visualizations, the time needed to conduct and report such evaluations remains high. In this paper, we propose a software system that integrates the conduct, analysis, and reporting of crowdsourced user evaluations directly into the scientific visualization development process. With the proposed system, researchers can conduct and analyze quantitative evaluations on a large scale through an evaluation-centric user interface with only a few mouse clicks. Thus, it becomes possible to perform iterative evaluations during algorithm design, which potentially leads to better results than the time-consuming user evaluations traditionally conducted at the end of the design process. Furthermore, the system is built around a centralized database, which supports easy reuse of previous evaluation designs and the reproduction of earlier evaluations with new or additional stimuli, both of which are key challenges in scientific visualization research. We describe the system's design and the considerations made during the design process, and demonstrate the system by conducting three user evaluations, all of which have previously been published in the visualization literature.
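The centralized-database idea behind reuse and reproduction can be illustrated with a minimal, purely hypothetical sketch (SQLite via Python). The table names, columns, and example question below are assumptions for illustration, not the authors' actual schema: an evaluation design is stored once, and later runs with new or additional stimuli simply reference that stored design, which is what makes re-running an earlier evaluation cheap.

```python
import sqlite3

# Hypothetical, minimal schema: a design row is stored once; stimuli and
# worker responses reference it, so an old evaluation can be re-run with
# new stimuli without redefining the study.
conn = sqlite3.connect("evaluations.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS design (
    id       INTEGER PRIMARY KEY,
    name     TEXT NOT NULL,
    question TEXT NOT NULL          -- e.g. a forced-choice question shown to workers
);
CREATE TABLE IF NOT EXISTS stimulus (
    id        INTEGER PRIMARY KEY,
    design_id INTEGER REFERENCES design(id),
    image_uri TEXT NOT NULL         -- rendered image used in the study
);
CREATE TABLE IF NOT EXISTS response (
    id          INTEGER PRIMARY KEY,
    stimulus_id INTEGER REFERENCES stimulus(id),
    worker_id   TEXT NOT NULL,
    answer      TEXT NOT NULL,
    rt_ms       INTEGER             -- response time in milliseconds
);
""")

# Reusing an existing design with new stimuli amounts to inserting new
# images under the same design_id.
design_id = conn.execute(
    "INSERT INTO design (name, question) VALUES (?, ?)",
    ("depth-perception", "Which object appears closer to the viewer?"),
).lastrowid
conn.execute(
    "INSERT INTO stimulus (design_id, image_uri) VALUES (?, ?)",
    (design_id, "renders/new_algorithm_frame_001.png"),
)
conn.commit()
```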

Bibtex

@inproceedings{englund16crowdsourcing,
	title={A Crowdsourcing System for Integrated and Reproducible Evaluation in Scientific Visualization},
	author={Englund, Rickard and Kottravel, Sathish and Ropinski, Timo},
	booktitle={Proceedings of the 2016 IEEE Pacific Visualization Symposium, PacificVis 2016, Taipei, Taiwan, April 19-22, 2016},
	year={2016},
	pages={40--47},
	editor={Hansen, Chuck and Viola, Ivan and Yuan, Xiaoru}
}