Discipline: Technology and Engineering
Subcategory: Computer Engineering
Norbu Tsering - Borough of Manhattan Community College, CUNY
Co-Author(s): Feng Hu, The Graduate Center, CUNY, NY; Hao Tang, Borough of Manhattan Community College, CUNY, NY
Visually impaired people need an easy way to localize themselves and navigate in new environments. Much work has been done to address this problem using different sensors. Previous work in the computer vision community uses 2D images for localization and navigation, which is challenging due to the lack of texture in indoor environments. In this project, we use full 3D information captured by a tablet with a depth sensor to localize the device itself. To facilitate navigation for the visually impaired, we design, implement, and evaluate a method that calculates the position and orientation of the device in two steps: (1) use a tablet with a depth sensor to pre-build a large 3D model of the indoor environment; (2) apply the Iterative Closest Point (ICP) algorithm to newly captured RGB-D data to calculate the user's new position and orientation. The system includes three components: environmental modeling, a pose estimation algorithm, and GUI design. An experiment with a real model within a university laboratory shows real-time, accurate performance. In our implementation, the reconstructed global model is generated by stitching the RGB-D data of individual frames together using the camera pose of each frame. A tablet with an on-board 3D sensor is adopted in our experiments. We built several large global 3D models in a campus building, then captured RGB-D frames at different locations and applied the registration method above to estimate where each frame was captured, i.e., the location of the user; the system estimated the correct locations in our experiments. In this paper, we propose a new method of localization for the visually impaired using a tablet with a depth sensor. We believe this study is important for the visually impaired, since portable depth sensors have become popular on some tablets and smartphones. As ongoing work, we are expanding the testing database to larger environments, e.g., a whole campus building.
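The pose-estimation step registers a newly captured RGB-D frame against the pre-built global model with ICP. The following is a minimal sketch of point-to-point ICP in NumPy, not the project's actual implementation: the function names are illustrative, and the brute-force nearest-neighbor search stands in for the k-d tree acceleration a real-time system would use.

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst
    (Kabsch algorithm via SVD). src and dst are (N, 3) with rows paired."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, max_iters=50, tol=1e-8):
    """Point-to-point ICP aligning a captured cloud (src) to the global
    model (dst). Returns a 4x4 homogeneous transform, i.e. the estimated
    device pose relative to the model."""
    T = np.eye(4)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(max_iters):
        # Nearest-neighbor correspondences (brute force for clarity)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        idx = d2.argmin(axis=1)
        err = np.sqrt(d2[np.arange(len(cur)), idx]).mean()
        # Best rigid transform for the current correspondences
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        Ti = np.eye(4)
        Ti[:3, :3], Ti[:3, 3] = R, t
        T = Ti @ T                           # accumulate the pose
        if abs(prev_err - err) < tol:        # converged
            break
        prev_err = err
    return T
```

As in the system described above, the same machinery serves both steps: during model building, successive frames are registered and stitched into the global cloud; during localization, a single new frame is registered against that model and the recovered transform gives the user's position and orientation.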
Funder Acknowledgement(s): This work is supported by NSF EFRI-REM under Award # EFRI-1137172, to the City College of New York.
Faculty Advisor: Zhigang Zhu, zzhu@ccny.cuny.edu