Discipline: Computer Sciences and Information Management
Subcategory: Computer Science & Information Systems
Tayo Amuneke - Borough of Manhattan Community College, CUNY
Co-Author(s): Hao Tang, Borough of Manhattan Community College, City University of New York, New York, NY
Visually impaired people find it very difficult to travel in unfamiliar environments without a means of guidance. To improve their quality of life, this project aims to create a mobile game for learning maps and routes in indoor environments. The game accepts input through the touch screen and sends feedback to users through audio and vibration output. Statistics from gameplay are provided so that blind users know how familiar they are with the environment, and they can repeat the map learning until they are comfortable with it. The proposed pre-journey method is easy to apply and will encourage visually impaired people to travel independently. This project focuses on the user interface design of the mobile app, because accessible human-computer interaction is essential for visually impaired users. A convenient user interface for the visually impaired has been designed by taking several factors into account: the orientation of the game interface, vibro-audio communication, flexible settings, the level of vibro-audio detail, and the accessibility mode of the app. Our current project integrates the game development engine with the Android native development environment using the Java Native Interface, which makes the learning simulation more accessible because it is compatible with the native operating system. Compatibility with the native accessibility feature is necessary because participants in our initial experiments had the accessibility feature (Android TalkBack and iOS VoiceOver) turned on by default. In the future, we hope to enhance the accessibility features of the game with more options and to automate the generation of semantic maps. We also plan to conduct more user studies with a larger set of participants. Future studies will use 3D-printed map layouts to test participants' ability to identify the experimental map, which will provide valuable feedback for further improvements to the project.
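As a minimal sketch of the vibro-audio communication and accessibility-mode handling described above, the Android side could expose a helper along the following lines. The class and method names (VibroAudioFeedback, pulse, speak, isScreenReaderActive) are illustrative placeholders, not the project's actual code.

```java
import android.content.Context;
import android.os.Build;
import android.os.VibrationEffect;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;
import android.view.accessibility.AccessibilityManager;

import java.util.Locale;

/**
 * Hypothetical helper for the vibro-audio feedback described in the abstract;
 * names are illustrative, not the project's actual API.
 */
public class VibroAudioFeedback {

    private Vibrator vibrator;
    private TextToSpeech tts;
    private AccessibilityManager accessibilityManager;

    public VibroAudioFeedback(Context context) {
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        accessibilityManager =
                (AccessibilityManager) context.getSystemService(Context.ACCESSIBILITY_SERVICE);
        // Text-to-speech engine used for spoken cues during map exploration.
        tts = new TextToSpeech(context, status -> {
            if (status == TextToSpeech.SUCCESS) {
                tts.setLanguage(Locale.US);
            }
        });
    }

    /** True when a screen reader such as TalkBack has touch exploration enabled. */
    public boolean isScreenReaderActive() {
        return accessibilityManager.isEnabled()
                && accessibilityManager.isTouchExplorationEnabled();
    }

    /** Short vibration pulse, e.g. when the user's finger crosses a wall on the map. */
    public void pulse(long milliseconds) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            vibrator.vibrate(
                    VibrationEffect.createOneShot(milliseconds, VibrationEffect.DEFAULT_AMPLITUDE));
        } else {
            vibrator.vibrate(milliseconds); // legacy API for pre-Oreo devices
        }
    }

    /** Spoken description of the map feature under the user's finger. */
    public void speak(String description) {
        tts.speak(description, TextToSpeech.QUEUE_FLUSH, null, "map-cue");
    }
}
```

In this sketch, the game would call pulse() when the user's finger reaches a map feature and speak() to describe it, adjusting how much is spoken according to the chosen level of vibro-audio detail.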
Funder Acknowledgement(s): This study is supported by a grant from NYC-LSAMP awarded to Tayo Amuneke.
Faculty Advisor: Hao Tang, hao.tang@gmail.com
Role: For visually impaired users, how they communicate with technology arguably matters more than the technology itself. My part of the research focuses on this communication between the mobile device and the user, that is, the human-computer interaction. We observed that the visually impaired participants kept the accessibility mode of their mobile devices turned on at all times. Android Studio was used together with the Unity 3D game development engine to make the mobile game compatible with the accessibility mode of the Android OS, and the Java Native Interface was used to communicate between the two platforms.
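As a rough illustration of this Java Native Interface path, the Java side of such a bridge might look like the sketch below. The class name AccessibilityBridge and its methods are hypothetical, not the project's actual implementation; a Unity C# script could reach a class like this through AndroidJavaClass, Unity's wrapper around JNI.

```java
import android.app.Activity;
import android.content.Context;
import android.view.accessibility.AccessibilityManager;

/**
 * Hypothetical Java-side plugin that the Unity game could reach over JNI.
 * Names are illustrative placeholders.
 */
public final class AccessibilityBridge {

    private AccessibilityBridge() {}

    /** Reports whether the system screen reader (e.g. TalkBack) is running. */
    public static boolean isScreenReaderEnabled(Activity activity) {
        AccessibilityManager manager =
                (AccessibilityManager) activity.getSystemService(Context.ACCESSIBILITY_SERVICE);
        return manager != null && manager.isEnabled();
    }

    /** Sends a spoken announcement through the platform accessibility service. */
    public static void announce(Activity activity, String message) {
        activity.runOnUiThread(() -> {
            // announceForAccessibility() asks TalkBack to speak the message
            // without requiring a native View hierarchy inside the Unity scene.
            activity.getWindow().getDecorView().announceForAccessibility(message);
        });
    }
}
```

Routing announcements through the native accessibility service in this way is what lets the Unity-rendered game cooperate with the screen reader the participants already had enabled.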