Discipline: Computer Sciences and Information Management
Subcategory: Computer Science & Information Systems
Mnsa Maat - Philander Smith College
Scientific visualization has emerged as an essential tool for analyzing the large datasets generated by scientific simulations. Novel tools such as Jupyter, UV-CDAT, and ParaView have been developed to support data analysis and visualization (DAV). However, scientists often underutilize such tools, and calls have been issued to address this issue. In this study we explore the application of usability heuristics to identify usability problems in DAV tools. We hypothesize that the heuristic inspection method is well suited to identifying usability issues in DAV tools. The study pursues two goals: (i) to explore the suitability of the heuristic inspection method for identifying usability problems in DAV tools; and (ii) to develop a framework of usability heuristics for evaluating DAV tools. Building on the work of Nielsen (1994), we propose a 12-rule heuristics framework for evaluating complex DAV tools. The 12 rules are: visibility of system status; match between system and real world; user control; consistency and standards; error prevention; recognition; flexibility and efficiency of use; aesthetic and minimalist design; help users recognize, diagnose, and recover from errors; help and documentation; customization; and sense of community. A systematic usability evaluation process was applied. First, we selected three DAV tools, Jupyter, UV-CDAT, and ParaView, which are widely used by scientific communities to generate visualizations. Second, two usability experts evaluated the three DAV tools against the 12-rule framework. Next, usability issues were identified in each tool. Then, the evaluators conducted a debriefing session on usability violations against the 12 rules. Fourth, we rated each usability problem on a severity scale from one (a minimal cosmetic issue) to four (a major usability problem).
Fifth, a usability report was generated describing the usability problems and their severity per heuristic rule. Finally, the evaluators met with the development team of each tool to share findings and suggest future improvements. Results indicate that Jupyter suffers from usability issues related to system visibility, while UV-CDAT shows notable usability issues related to user recognition, user control, documentation, and sense of community. Regarding ParaView, the inspection reveals that the software lacks sufficient error-prevention mechanisms when rendering and visualizing objects. The study develops a systematic process for usability heuristic evaluation that can assist others in evaluating the usability of DAV tools. We also identify and quantify the usability of three DAV tools. Future research aims to assess the usability of R and VisIt and to evaluate the value of the developed process and heuristics for inspecting the usability of complex analysis and visualization systems.
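As an illustrative sketch only (not part of the published study), the severity-rating and per-heuristic reporting steps above could be recorded with a simple data structure like the one below; the tool names, issue descriptions, and function names are hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of recording heuristic-evaluation findings and
# summarizing the worst severity observed per heuristic for each tool.
HEURISTICS = [
    "visibility of system status", "match between system and real world",
    "user control", "consistency and standards", "error prevention",
    "recognition", "flexibility and efficiency of use",
    "aesthetic and minimalist design",
    "help users recognize, diagnose, and recover from errors",
    "help and documentation", "customization", "sense of community",
]

def record_issue(findings, tool, heuristic, severity, description):
    """Log one usability problem; severity 1 (cosmetic) to 4 (major)."""
    assert heuristic in HEURISTICS and 1 <= severity <= 4
    findings[tool].append((heuristic, severity, description))

def report(findings):
    """Return, per tool, the worst severity recorded for each heuristic."""
    summary = {}
    for tool, issues in findings.items():
        worst = defaultdict(int)
        for heuristic, severity, _ in issues:
            worst[heuristic] = max(worst[heuristic], severity)
        summary[tool] = dict(worst)
    return summary

# Illustrative entries; the specific ratings here are invented examples.
findings = defaultdict(list)
record_issue(findings, "Jupyter", "visibility of system status", 3,
             "no progress feedback during a long-running operation")
record_issue(findings, "ParaView", "error prevention", 4,
             "no warning before an invalid rendering operation")
```

A debriefing session would then walk through `report(findings)` heuristic by heuristic to reconcile the two evaluators' ratings.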
References: Nielsen, J. 1994. Heuristic evaluation. In Nielsen, J. and Mack, R.L. (Eds.), Usability Inspection Methods. New York: John Wiley & Sons.
Funder Acknowledgement(s): This work was funded partially by NSF-HBCU UP Award No. 1238895 and Berkeley Lab University Faculty Fellowship (BLUFF) program.
Faculty Advisor: Samar Swaid, email@example.com
Role: My role in the research was to evaluate and report on the visualization tools described in the abstract. I was one of two evaluators who carried out the usability heuristic evaluation and generated the findings.