ERN: Emerging Researchers National Conference in STEM


Heuristics Usability Inspection for Analysis and Visualization Tools

Undergraduate #233
Discipline: Computer Sciences and Information Management
Subcategory: Computer Science & Information Systems

Mnsa Maat - Philander Smith College


Scientific visualization has emerged as an essential tool for analyzing large datasets generated by scientific simulations. Novel tools have been developed to support data analysis and visualization (DAV), such as Jupyter, UV-CDAT, and ParaView. However, scientists' limited adoption of such tools has been noted, and calls have been issued to resolve this issue. In this study we explore the application of usability heuristics to identify usability problems in DAV tools. We hypothesize that the usability heuristics inspection method is suitable for identifying usability issues in DAV tools. The study is conducted to achieve two goals: (i) to explore the suitability of the usability heuristics inspection method for identifying usability problems in DAV tools; and (ii) to develop a framework of usability heuristics for evaluating DAV tools. Building on the work of Nielsen (1994), we propose a 12-rule heuristics framework for evaluating complex DAV tools. The 12 rules are: visibility of system status; match between system and real world; user control; consistency and standards; error prevention; recognition; flexibility and efficiency of use; aesthetic and minimalist design; help users recognize, diagnose, and recover from errors; help and documentation; customization; and sense-of-community. A systematic usability evaluation process was applied. First, we selected three DAV tools widely used by scientific communities to generate visualizations: Jupyter, UV-CDAT, and ParaView. Second, two usability experts evaluated the three DAV tools against the 12-rule framework. Third, usability issues were identified in each of the tools. Fourth, the evaluators conducted a debriefing session on usability violations against the 12 rules. Fifth, we rated the usability problems on a scale from one to four, where a value of one indicates a minimal cosmetic usability issue and a value of four indicates a major usability problem.
Sixth, a usability report was generated describing the usability problems and their severity per heuristic rule. Finally, the evaluators met with the development teams of each of the tools to share findings and suggest future improvements. Results indicate that Jupyter suffers from usability issues related to system visibility, while UV-CDAT shows notable usability issues related to user recognition, user control, documentation, and sense-of-community. Regarding ParaView, the inspection reveals that the software lacks sufficient error prevention mechanisms when rendering and visualizing objects. The study develops a systematic process for usability heuristics that would assist others in evaluating the usability of DAV tools. We also identify and quantify the usability of three DAV tools. Future research aims to assess the usability of R and VisIt and to evaluate the value of the developed process and heuristics for inspecting the usability of complex analysis and visualization systems.
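The severity-rating and reporting steps described above can be sketched as a short script. This is a minimal illustration, not the authors' actual instrument: the issue records and heuristic names are hypothetical, and the 1–4 severity scale follows the definition in the abstract (1 = minimal cosmetic issue, 4 = major usability problem):

```python
from collections import defaultdict

# Hypothetical evaluation records: (tool, violated heuristic, severity 1-4).
# Severity 1 = minimal cosmetic issue; 4 = major usability problem.
issues = [
    ("Jupyter", "visibility of system status", 3),
    ("UV-CDAT", "user control", 4),
    ("UV-CDAT", "sense-of-community", 2),
    ("ParaView", "error prevention", 4),
    ("ParaView", "error prevention", 2),
]

def summarize(records):
    """Tally violation counts and worst severity per (tool, heuristic)."""
    summary = defaultdict(lambda: {"count": 0, "max_severity": 0})
    for tool, heuristic, severity in records:
        entry = summary[(tool, heuristic)]
        entry["count"] += 1
        entry["max_severity"] = max(entry["max_severity"], severity)
    return dict(summary)

report = summarize(issues)
for (tool, heuristic), stats in sorted(report.items()):
    print(f"{tool}: {heuristic} - {stats['count']} issue(s), "
          f"worst severity {stats['max_severity']}")
```

In practice each evaluator would record issues independently before the debriefing session, and the per-heuristic summary would feed the usability report shared with the development teams.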
References: Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J. and Mack, R. L. (Eds.), Usability Inspection Methods. New York: John Wiley & Sons.

Funder Acknowledgement(s): This work was funded partially by NSF-HBCU UP Award No. 1238895 and Berkeley Lab University Faculty Fellowship (BLUFF) program.

Faculty Advisor: Samar Swaid, sswaid@philander.edu

Role: My role in the research was to evaluate and report on the visualization tools described in the abstract. I was one of two evaluators who implemented the usability heuristic evaluation and generated the findings.
