reVISit: Scalable Empirical Evaluation of Interactive Visualizations

The reVISit project addresses a critical bottleneck in visualization research: how can we empirically evaluate visualization techniques more effectively and efficiently? The reVISit infrastructure aims to democratize the evaluation of interactive visualization techniques, an area that has been under-explored, due in part to the high technical burden and skill required to create complex online experiments.

The key innovations of this project are:

(1) Software infrastructure for flexible study creation and instrumented data collection, including interaction provenance, insights, and rationales, compatible with online crowdsourced study contexts.

(2) Software infrastructure to wrangle the results data into formats compatible with off-the-shelf analysis tools, as well as advanced infrastructure to analyze these diverse data streams for piloting, quality control, and the analysis of usage patterns, insights, rationales, and performance (see the sketch below).

These methods will allow visualization researchers to gather empirical evidence about the merits of different interactive visualization techniques. They will allow researchers to understand the types of insights that different techniques support, revealing the diverging analysis strategies users may take. Ultimately, these methods will enable a wider set of visualization researchers to run a much broader range of crowdsourced experiments than was previously possible.

Demo

You can check out a few example projects on our demo page. All of the demos on this site are built from stimuli and examples that you can find in the GitHub repo.

Check out the getting started tutorial to learn how to build your own experiment.

Current Release

reVISit 1.0 is released and ready to use!

A paper that uses reVISit for studies:

Maxim Lisnic, Zach Cutler, Marina Kogan, Alexander Lex
Visualization Guardrails: Designing Interventions Against Cherry-Picking in Interactive Data Explorers
Preprint, doi:10.31219/osf.io/4j9nr, 2024.

Paper

For a concise description of the project, check out the short paper.

Yiren Ding, Jack Wilburn, Hilson Shrestha, Akim Ndlovu, Kiran Gadhave,
Carolina Nobre, Alexander Lex, Lane Harrison
reVISit: Supporting Scalable Evaluation of Interactive Visualizations
IEEE Visualization and Visual Analytics (VIS), 31-35, doi:10.1109/VIS54172.2023.00015, 2023.