reVISit: Scalable Empirical Evaluation of Interactive Visualizations

The reVISit project addresses a critical bottleneck in visualization research: how can we empirically evaluate visualization techniques better and more efficiently? The reVISit infrastructure aims to democratize the evaluation of interactive visualization techniques, an area that remains under-explored, in part because of the high technical burden and skills required to create complex online experiments.

The key innovations of this project are:

(1) Software infrastructure for flexible study creation and instrumented data collection, including interaction provenance, insights, and rationales, that is compatible with online crowdsourced study contexts (a sketch of what such a record might look like appears below).

(2) Software infrastructure to wrangle the resulting data into formats compatible with off-the-shelf analysis tools, as well as advanced software infrastructure to analyze these diverse data streams for piloting, quality control, and the analysis of usage types, insights, rationales, and performance.

These methods will allow visualization researchers to gather empirical evidence about the merits of different interactive visualization techniques. They will also allow researchers to understand the types of insights that different techniques support, revealing the diverging analysis strategies users may take. Ultimately, these methods will enable a wider set of visualization researchers to run a much broader range of crowdsourced experiments than was previously possible.
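
To make the kind of data described in innovation (1) more concrete, the following is a small sketch, written as TypeScript type declarations, of what one instrumented record might contain. The type and field names (ProvenanceEvent, ResponseRecord, and so on) are illustrative assumptions, not the actual reVISit data schema.

    // Illustrative sketch only: a plausible shape for one logged interaction
    // event. Field names are assumptions, not the actual reVISit schema.
    interface ProvenanceEvent {
      participantId: string; // anonymous ID of the crowdsourced participant
      component: string;     // the task/stimulus in which the event occurred
      action: string;        // e.g., "hover", "brush", "select"
      target: string;        // the visualization element that was acted on
      timestamp: number;     // milliseconds since the task started
    }

    // A participant's answer plus free-text rationale, stored alongside the
    // interaction log so that insights, rationales, and performance can be
    // analyzed together.
    interface ResponseRecord {
      participantId: string;
      component: string;
      answer: string | number;
      rationale: string;
      events: ProvenanceEvent[];
    }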

Demo

You can check out a few example projects on our demo page. All of the demos on this site are built from stimuli and examples that you can find in the GitHub repo.

Check out the getting started tutorial to learn how to build your own experiment.
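
To give a flavor of what defining an experiment involves before you dive into the tutorial, here is a minimal, purely illustrative sketch of a declarative study definition written as a TypeScript object. The field names (studyMetadata, components, sequence) and file paths are assumptions for illustration, not the actual reVISit configuration schema; the getting started tutorial documents the real format.

    // Illustrative sketch only: field names and paths are assumptions, not the
    // actual reVISit configuration schema (see the getting started tutorial).
    const studySketch = {
      studyMetadata: {
        title: 'Scatterplots vs. parallel coordinates',
        authors: ['Your Name'],
        description: 'Which technique better supports cluster identification?',
      },
      components: {
        // Each component pairs a stimulus with the response it collects.
        scatterplotTask: {
          stimulus: 'stimuli/scatterplot.html',
          response: { type: 'numerical', prompt: 'How many clusters do you see?' },
        },
        parallelCoordinatesTask: {
          stimulus: 'stimuli/parallel-coordinates.html',
          response: { type: 'numerical', prompt: 'How many clusters do you see?' },
        },
      },
      // The order in which participants see the components; a real study would
      // typically randomize or counterbalance this.
      sequence: ['scatterplotTask', 'parallelCoordinatesTask'],
    };

    export default studySketch;

The intent of a declarative definition like this is to let researchers focus on stimuli, tasks, and questions rather than building a bespoke web application for every experiment.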

Paper

For a concise description of the project, check out the short paper.

Yiren Ding, Jack Wilburn, Hilson Shrestha, Akim Ndlovu, Kiran Gadhave,
Carolina Nobre, Alexander Lex, Lane Harrison
reVISit: Supporting Scalable Evaluation of Interactive Visualizations
IEEE Visualization and Visual Analytics (VIS), 31-35, doi:10.1109/VIS54172.2023.00015, 2023.

Project Team

reVISit is a project developed at the University of Utah, Worcester Polytechnic Institute, and the University of Toronto.

Alexander Lex, Co-PI, University of Utah
Lane Harrison, Co-PI, WPI
Carolina Nobre, Co-I, University of Toronto
Jack Wilburn, Senior Software Engineer, University of Utah
Zach Cutler, PhD Student, University of Utah
Yiren Ding, PhD Student, WPI
Kiran Gadhave, PhD Student, University of Utah
Akim Ndlovu, PhD Student, WPI
Hilson Shrestha, PhD Student, WPI
Brian Bollen, Senior Software Developer, University of Utah

Steering Committee

reVISit is advised by a steering committee composed of researchers who regularly run a diverse set of studies.

Danielle Albers Szafir, University of North Carolina at Chapel Hill
Cindy Xiong Bearfield, Georgia Tech
Ana Crisan, Tableau Research
Alex Endert, Georgia Tech
Jean-Daniel Fekete, INRIA Paris
Petra Isenberg, INRIA Paris
Lace Padilla, Northeastern University
John Stasko, Georgia Tech
Manuela Waldner, TU Vienna

Contact

If you have any questions, please e-mail us.

Acknowledgements

reVISit is funded by the National Science Foundation under the title "Collaborative Research: CCRI: New: reVISit: Scalable Empirical Evaluation of Interactive Visualizations", through CNS, with award numbers 2213756 and 2213757.

We are grateful to Cindy Xiong Bearfield, Lace Padilla, and Danielle Albers Szafir for advice on the requirements of a study platform.