Discipline: Computer Sciences and Information Management
Subcategory: Computer Science & Information Systems
Session: 2
Room: Exhibit Hall A
Elise Olivolo - Fayetteville State University
Current challenges in robotics research stem from emerging technologies such as mobile computing, bio-hybridization, artificial intelligence (AI), and brain-computer interfaces. The objective of this research project was to create a database of household items and their desired characteristics using SoftBank Robotics' NAO humanoid robot. The NAO robot was programmed to recognize, label, and categorize objects based on each item's characteristics. A selection of scenes was created for this project, including kitchen, luggage, bedroom, furniture, and office settings. Test results on NAO's video quality revealed a limited ability to detect visual texture across shapes of varying difficulty, such as long, skinny, spiral-like objects. Other factors that interfered with NAO's recognition software included the angle at which the robot viewed a particular object, as well as the lighting and distance of the item being detected. However, the more input and training NAO received on a multitude of items, the higher its item-identification success rate. This was accomplished by training on, and building a database of, 250 objects captured at various angles and under varied lighting conditions. Our future goal is to have NAO build on this object database: NAO will visually recognize live actions performed in sequence and respond with the correct corresponding action. For this project, we first tested the basic actions NAO could complete using the functions originally stored on the robot; however, altering the code produced better results against the set objectives. With the modified code, the robot efficiently gathered data and formed a working internal database, identifying objects and describing presented scenarios.
Funder Acknowledgement(s): This research project was supported by the HBCU-UP Targeted Infusion Project (Award ID: 1818694).
Faculty Advisor: Dr. Bhattacharya, sbhattac@uncfsu.edu
Role: In this research, I photographed 250 items using NAO, the humanoid robot, so that it could visually recognize each item regardless of its position. This included a 360-degree span of each object, the objects under different lighting, and the objects at distances of up to one foot. In doing so, I created a database of items that NAO was shown to learn with 100% success. With this database of visually recognized items, I edited NAO's code to group the known objects by scene descriptions, which I set to kitchen, bedroom, office, and luggage.
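The scene-grouping step described above can be sketched as follows. This is a minimal illustrative sketch, not the project's actual code: the scene names (kitchen, bedroom, office, luggage) come from the abstract, while the function name, item labels, and keyword mapping are hypothetical assumptions.

```python
# Hypothetical sketch: group recognized object labels into the scene
# categories named in the abstract. The item labels and the mapping
# below are illustrative assumptions, not the project's database.

SCENE_KEYWORDS = {
    "kitchen": {"mug", "plate", "kettle", "spatula"},
    "bedroom": {"pillow", "lamp", "alarm clock"},
    "office": {"stapler", "keyboard", "notebook"},
    "luggage": {"suitcase", "duffel bag", "backpack"},
}

def group_by_scene(recognized_labels):
    """Sort recognized labels into scene buckets; collect unknowns."""
    grouped = {scene: [] for scene in SCENE_KEYWORDS}
    unmatched = []
    for label in recognized_labels:
        for scene, items in SCENE_KEYWORDS.items():
            if label in items:
                grouped[scene].append(label)
                break
        else:
            # Label not in any scene set: flag it for future training.
            unmatched.append(label)
    return grouped, unmatched

labels = ["mug", "suitcase", "keyboard", "pillow", "whisk"]
grouped, unmatched = group_by_scene(labels)
# "whisk" is not in any scene set, so it lands in unmatched.
```

In practice, the recognized labels would come from NAO's vision modules rather than a hard-coded list; unmatched labels could then drive the next round of database training.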