Emerging Researchers National (ERN) Conference

RGB-D Sensor Based Indoor Localization for Visually Impaired

Undergraduate #350
Discipline: Technology and Engineering
Subcategory: Computer Engineering

Norbu Tsering - Borough of Manhattan Community College, CUNY
Co-Author(s): Feng Hu, The Graduate Center, CUNY, NY; Hao Tang, Borough of Manhattan Community College, CUNY, NY



Visually impaired people need an easy way to localize and navigate in new environments. Much prior work has addressed this problem with different sensors. Previous work in the computer vision community uses 2D images for localization and navigation, which is challenging due to the lack of texture in indoor environments. In this project, we use full 3D information captured by a tablet with a depth sensor to localize the device itself. To facilitate navigation for the visually impaired, we design, implement, and evaluate a method that calculates the position and orientation of the device in two steps: (1) use a tablet with a depth sensor to pre-build a large 3D model of the indoor environment; (2) apply the Iterative Closest Point (ICP) algorithm to newly captured RGB-D data to calculate the user's new position and orientation. The system includes three components: environmental modeling, a pose estimation algorithm, and GUI design. Experiments with a real model of a university laboratory show real-time, accurate performance. In our implementation, the reconstructed global model is generated by stitching together the RGB-D data of individual frames using each frame's camera pose. A tablet with an on-board 3D sensor is used in our experiments. We built several large global 3D models in a campus building, captured RGB-D frames at different locations, and applied the aforementioned registration method to estimate where each frame was captured, i.e., the user's location; the system estimated the correct locations in these experiments. In this paper, we propose a new method of localization for the visually impaired using a tablet with a depth sensor. We believe this study is important for the visually impaired, since portable depth sensors have become common on tablets and smartphones. As ongoing work, we are expanding the testing database to larger environments, e.g., a whole campus building.
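
The two-step pipeline described above (pre-build a global 3D model, then register newly captured RGB-D frames against it with ICP) can be sketched as follows. This is a minimal illustration using the open-source Open3D library, not the authors' implementation; all function names, file names, and parameter values are assumptions.

```python
# A minimal sketch of the two-step pipeline described in the abstract,
# written with the open-source Open3D library. Everything here is an
# illustrative assumption; the abstract does not specify an implementation.
import numpy as np
import open3d as o3d


def build_global_model(frame_clouds, camera_poses, voxel_size=0.02):
    """Step 1: stitch per-frame RGB-D point clouds into one global model.

    frame_clouds: list of o3d.geometry.PointCloud, one per RGB-D frame.
    camera_poses: list of 4x4 numpy arrays mapping each frame to world
                  coordinates (the per-frame poses mentioned in the text).
    """
    model = o3d.geometry.PointCloud()
    for cloud, pose in zip(frame_clouds, camera_poses):
        cloud.transform(pose)      # move the frame into world coordinates
        model += cloud             # accumulate into the global model
    # Voxel downsampling keeps the model small enough for real-time ICP.
    return model.voxel_down_sample(voxel_size)


def localize(new_cloud, global_model, init_pose=np.eye(4),
             max_corr_dist=0.05):
    """Step 2: estimate the device pose of a newly captured RGB-D frame
    by registering it against the global model with point-to-point ICP.

    ICP is a local method, so init_pose should be a coarse initial guess
    (e.g., the last known location); the identity default only works if
    the user starts near the model's origin.
    """
    result = o3d.pipelines.registration.registration_icp(
        new_cloud, global_model, max_corr_dist, init_pose,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation   # 4x4 pose: position and orientation


# Illustrative usage with hypothetical file names:
# model = o3d.io.read_point_cloud("campus_building_model.pcd")
# frame = o3d.io.read_point_cloud("new_rgbd_frame.pcd")
# pose = localize(frame, model)
# position = pose[:3, 3]   # estimated x, y, z of the device in the building
```

Downsampling the global model before registration is the main lever for keeping pose estimation real-time on a tablet, at the cost of some registration accuracy.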

Funder Acknowledgement(s): This work is supported by NSF EFRI-REM under Award #EFRI-1137172 to the City College of New York.

Faculty Advisor: Zhigang Zhu, zzhu@ccny.cuny.edu
