ERN: Emerging Researchers National Conference in STEM


iPhone Indoor Navigation System for Visually Impaired People

Undergraduate #123
Discipline: Technology and Engineering
Subcategory: Computer Engineering

Jeury Mejia - Borough of Manhattan Community College
Co-Author(s): Feng Hu, The Graduate Center, CUNY; Hao Tang, Borough of Manhattan Community College, CUNY



Localizing visually impaired people within an indoor environment is of major importance, since this information can help reduce the risks they face while moving and navigating. In this work, we implement an iPhone-based indoor navigation system to help blind users with localization and navigation. The system consists of a smartphone with a specially designed omnidirectional lens mounted on its case. The lens captures omnidirectional images, which represent panoramic information about the environment in a single shot. The indoor environment information covered in each shot is compacted: concise features are extracted from the images and stored on a remote GPU-enabled server. When users want to localize themselves, they simply capture a short video; the app sends the compressed information to the server, and the server computes the location and sends it back to the user. On the server, this information is compared against information already stored in a large frame-based database, and the current location is identified by finding the best-matched frame. We use TCP/IP socket communication to establish a connection between the front-end iPhone app and the server; the latter can potentially serve multiple users simultaneously, which makes the system scalable. The application provides real-time data acquisition, transmission, and querying, and few of the phone's resources are needed because most of the processing takes place on the server. Currently, the app's UI (User Interface) consists of three main panels: user interface, admin interface, and network interface. The user panel, which is for the visually impaired, is simple and intuitive to use: it allows a blind user to record a short video by pressing a large button in the middle of the screen; when they release the button, the app provides audio feedback with their current location in the environment.
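The server-side localization step, identifying the current location by finding the best-matched frame in the database, can be sketched as follows. The feature vectors, the cosine-similarity metric, and the room labels here are illustrative assumptions for the sketch, not the actual features or matching method used by the system:

```python
import numpy as np

def localize(query_feat, db_feats, db_locations):
    """Return the location label of the database frame whose feature
    vector best matches the query, plus the similarity score.

    Matching here is cosine similarity over (hypothetical) fixed-length
    feature vectors, one per stored frame.
    """
    q = query_feat / np.linalg.norm(query_feat)
    db = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)
    sims = db @ q                    # one similarity score per stored frame
    best = int(np.argmax(sims))      # index of the best-matched frame
    return db_locations[best], float(sims[best])

# Toy database: three stored frames, each tagged with a location label.
db_feats = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.7, 0.7, 0.0]])
db_locations = ["Room 101", "Hallway A", "Elevator lobby"]

loc, score = localize(np.array([0.9, 0.1, 0.0]), db_feats, db_locations)
```

In a real deployment the database would hold features extracted from the omnidirectional images, and the query features would come from the user's short video rather than a single hand-written vector.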
The admin and network interfaces are intended for research purposes and help the developers tune the different parameters (e.g., video quality, video recording, IP address) involved in the whole process. Our system is low-cost and easy to use, factors that make it a viable solution to issues regarding the safety and accessibility of blind people.
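The TCP/IP socket connection between the front-end app and the server, with one thread per user so multiple clients can be answered simultaneously, might look like the minimal Python sketch below. The payload format and the `location:` reply are hypothetical stand-ins for the system's actual protocol:

```python
import socket
import threading

def handle_client(conn):
    """One thread per user: receive the query payload the app sends,
    run the (stand-in) location lookup, and reply on the same socket."""
    with conn:
        data = conn.recv(4096)            # compressed query from the phone
        reply = b"location:Room 101"      # placeholder for the real match
        conn.sendall(reply)

def start_server(host="127.0.0.1"):
    """Bind to an ephemeral port and accept clients concurrently."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen()
    port = srv.getsockname()[1]

    def accept_loop():
        while True:
            try:
                conn, _ = srv.accept()
            except OSError:               # server socket was closed
                break
            threading.Thread(target=handle_client,
                             args=(conn,), daemon=True).start()

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv, port

def query(port, payload=b"frame-features"):
    """Minimal client, standing in for the iPhone app's query."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(payload)
        return c.recv(4096)
```

The thread-per-connection design is one simple way to realize the "multiple users simultaneously" property the abstract describes; a production server would likely also frame messages explicitly rather than rely on single `recv` calls.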

Funder Acknowledgement(s): This work is supported by NSF EFRI-REM under Award # EFRI-1137172, to the City College of New York.

Faculty Advisor: Zhigang Zhu

This material is based upon work supported by the National Science Foundation (NSF) under Grant No. DUE-1930047. Any opinions, findings, interpretations, conclusions or recommendations expressed in this material are those of its authors and do not represent the views of the AAAS Board of Directors, the Council of AAAS, AAAS’ membership or the National Science Foundation.
