
What is reVISit?

The reVISit project addresses a critical bottleneck in user-centered research: how can we run web-based experiments better and more efficiently? ReVISit aims to democratize the evaluation of all kinds of experiments, ranging from perceptual studies to interactive visualization techniques. ReVISit is most useful for complex, interactive experiments, an area that has been under-explored due in part to the high technical burden and skills required to run them.

The key innovations of this project are:

  1. Flexible study creation based on a domain-specific language, the reVISit Spec.
  2. Simple data collection — out of the box for standard response types, but with the ability to track detailed events based on provenance-enabled stimuli.
  3. Advanced modalities — record participants' audio (for think-aloud studies) and screens (video), which is useful for debugging and qualitative analysis.
  4. Simple data storage — no need to run your own servers: data is stored in a Firebase instance that is relatively easy to set up. Of course, you can also use a self-hosted server if you want full control.
  5. Open source and free — share your study design with anyone, no license required.
  6. Compatible with crowdsourcing platforms — recruit your participants through your preferred provider, such as Prolific, Mechanical Turk, or Lab in the Wild.
  7. Keep track of study progress — see how participants are doing and identify issues with your study quickly.
  8. Export your data — in a format suitable for your analysis.
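To give a flavor of the reVISit Spec mentioned above, here is a minimal sketch of what a study configuration might look like. The specific keys, paths, and values shown are illustrative assumptions for a hypothetical two-component study, not a verbatim schema; see the getting started tutorial and the reVISit documentation for the authoritative spec.

```json
{
  "studyMetadata": {
    "title": "Example Perception Study",
    "version": "1.0",
    "authors": ["Your Name"],
    "description": "A hypothetical study sketch: an intro page followed by one trial."
  },
  "uiConfig": {
    "withProgressBar": true
  },
  "components": {
    "intro": {
      "type": "markdown",
      "path": "assets/intro.md",
      "response": []
    },
    "trial-1": {
      "type": "image",
      "path": "assets/chart.png",
      "response": [
        {
          "id": "estimate",
          "prompt": "What percentage does bar A represent?",
          "type": "numerical",
          "required": true
        }
      ]
    }
  },
  "sequence": {
    "order": "fixed",
    "components": ["intro", "trial-1"]
  }
}
```

The general shape — study metadata, reusable components (stimuli plus their response forms), and a sequence that orders them — is what makes the spec flexible: the same components can be reused across conditions, and the sequence can randomize or counterbalance their order.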

Demo and Examples

You can check out demonstrations of individual features on our demo page. All of the demos on this site are built from stimuli and examples that you can find in the GitHub repo.

You can also check out example studies that have been built with reVISit on our example studies page. These are full studies, with all the complexity you encounter in practice, that have been analyzed and published. You can also explore the analytics interface for these studies to see how the data is collected and stored.

Check out the getting started tutorial to learn how to build your own experiment.

Paper

If you are using reVISit for a paper, please cite:

Paper Reference

Zach Cutler, Jack Wilburn, Hilson Shrestha, Yiren Ding, Brian Bollen, Khandaker Abrar Nadib, Tingying He, Andrew McNutt, Lane Harrison, Alexander Lex. ReVISit 2: A Full Experiment Life Cycle User Study Framework. IEEE Transactions on Visualization and Computer Graphics (VIS), 32(1): 13-23, doi:10.1109/TVCG.2025.3633896, 2026. IEEE VIS 2025 Best Paper Award

If you use version 1 of reVISit, please cite:

Paper Reference

Yiren Ding, Jack Wilburn, Hilson Shrestha, Akim Ndlovu, Kiran Gadhave, Carolina Nobre, Alexander Lex, Lane Harrison. reVISit: Supporting Scalable Evaluation of Interactive Visualizations. IEEE Visualization and Visual Analytics (VIS), 31-35, doi:10.1109/VIS54172.2023.00015, 2023.

Project Team

ReVISit is a project developed at the University of Utah and Worcester Polytechnic Institute.

Alexander Lex, Co-PI, Graz University of Technology and University of Utah
Lane Harrison, Co-PI, WPI
Zach Cutler, PhD Student, University of Utah
Yiren Ding, PhD Student, WPI
Tingying He, Postdoc, University of Utah
Jay Kim, Software Engineer, University of Utah
Andrew McNutt, Assistant Professor, University of Utah
Abhraneel Sarma, Postdoc, Graz University of Technology
Hilson Shrestha, PhD Student, WPI
Jack Wilburn, Senior Software Engineer, University of Utah

Alumni

Carolina Nobre, Co-I, University of Toronto
Brian Bollen, Senior Software Developer, University of Utah
Kiran Gadhave, PhD Student, University of Utah
Akim Ndlovu, PhD Student, WPI

Contributors

Sheng Long, PhD Student, Northwestern University

Contact

If you have any questions, please e-mail us.

Acknowledgements

ReVISit is funded by the National Science Foundation, under the title “Collaborative Research: CCRI: New: reVISit: Scalable Empirical Evaluation of Interactive Visualizations”, through CNS with award numbers 2213756 and 2213757.

We are grateful to Cindy Xiong Bearfield, Lace Padilla, and Danielle Albers Szafir for advice on the requirements of a study platform.

Papers Published as part of the NSF CCRI Grant

Paper Reference

Zach Cutler, Jack Wilburn, Hilson Shrestha, Yiren Ding, Brian Bollen, Khandaker Abrar Nadib, Tingying He, Andrew McNutt, Lane Harrison, Alexander Lex. ReVISit 2: A Full Experiment Life Cycle User Study Framework. IEEE Transactions on Visualization and Computer Graphics (VIS), 32(1): 13-23, doi:10.1109/TVCG.2025.3633896, 2026. IEEE VIS 2025 Best Paper Award

Paper Reference

Yiren Ding, Jack Wilburn, Hilson Shrestha, Akim Ndlovu, Kiran Gadhave, Carolina Nobre, Alexander Lex, Lane Harrison. reVISit: Supporting Scalable Evaluation of Interactive Visualizations. IEEE Visualization and Visual Analytics (VIS), 31-35, doi:10.1109/VIS54172.2023.00015, 2023.

Paper Reference

Maxim Lisnic, Zach Cutler, Marina Kogan, Alexander Lex. Visualization Guardrails: Designing Interventions Against Cherry-Picking in Interactive Data Explorers. SIGCHI Conference on Human Factors in Computing Systems (CHI), 1-19, doi:10.1145/3706598.3713385, 2025.

Paper Reference

Zach Cutler, Lane Harrison, Carolina Nobre, Alexander Lex. Crowdsourced Think-Aloud Studies. SIGCHI Conference on Human Factors in Computing Systems (CHI), 1-23, doi:10.1145/3706598.3714305, 2025.