Discipline: Technology and Engineering
Subcategory: Biomedical Engineering
Lonika Behera - University of the District of Columbia
Co-Author(s): Pa'tron C. Johnson, Mehdi Badache, and Laura Rojas, University of the District of Columbia, Washington, DC
Hypothesis statement and why the research is important: We hypothesized that a 3-D printed, Open Bionics (robotic) prosthetic hand could accomplish common hand movements and gestures needed for daily tasks and functions. In the United States, roughly 2 million individuals are currently living with limb loss. In particular, upper-limb amputations account for approximately 70% of trauma-related amputations and close to 60% of congenital amputations. There is a high demand for prosthetics to become more life-like and to have higher functionality within one's daily-living environments.
Methods and controls: All experiments were conducted within the Center for Biomechanical & Rehabilitation Engineering (CBRE) at the University of the District of Columbia. For control of the prosthetic, robotic hand, an Arduino programming environment and an ATmega2560 microcontroller were used. Hand movements from two subjects (controls) were compared to the robotic hand movements for the following gestures: fist grip; palm grip; pinch grip; finger-pulls; and keyboard typing. To quantify each of the gestures, forces were acquired using pressure-mapping sensors and dynamometers, then compared for the robotic hand relative to the controls.
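The binary ("on"/"off") closure behavior of the hand can be illustrated with a minimal C++ sketch of a gesture command table. All names and servo angles below are hypothetical assumptions for illustration only; the actual firmware ran in the Arduino environment on the ATmega2560 and is not reproduced here.

```cpp
#include <array>
#include <map>
#include <string>

// Hypothetical sketch: a binary gesture command table for a five-servo
// prosthetic hand. Angles (degrees) are illustrative assumptions, not
// values from the study.
using FingerAngles = std::array<int, 5>; // thumb, index, middle, ring, little

FingerAngles gestureCommand(const std::string& gesture, bool close) {
    // Fully open position for all fingers.
    if (!close) return {0, 0, 0, 0, 0};
    // Each gesture maps to one fully-actuated pose: the hand switches
    // between open and closed rather than modulating grip force.
    static const std::map<std::string, FingerAngles> poses = {
        {"fist",  {160, 160, 160, 160, 160}}, // all fingers flexed
        {"palm",  {90, 120, 120, 120, 120}},  // partial wrap around an object
        {"pinch", {150, 150, 0, 0, 0}},       // thumb-index opposition only
    };
    auto it = poses.find(gesture);
    return it != poses.end() ? it->second : FingerAngles{0, 0, 0, 0, 0};
}
```

Because each gesture resolves to a single fixed pose, such a controller can only produce one force level per grip, consistent with the binary closures reported in the results.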
Results: The robotic hand motion was observed to have limited, binary ('on' or 'off') closures. This affected the forces that could be exerted relative to the controls, which could apply a broader range of forces for the different gestures. Further, full closure (e.g., during the fist grip) was difficult to achieve, and the robotic hand therefore did not exert the same force levels as the controls.
Conclusions and future research questions: As a step toward investigations using the prosthetic hand on upper-limb amputees, our future work will include the use of wireless surface electromyography (EMG) sensors in conjunction with the prosthetic hand. This may allow for robotic hand outputs closer to what would be seen in non-amputees.
Funder Acknowledgement(s): This study was supported by NSF grant HRD-1533479, an NSF HBCU-UP Targeted Infusion Project entitled 'The Integration, Cultivation, and Exposure to Biomedical Engineering at University of the District of Columbia', awarded to the Principal Investigator, Lara A. Thompson, Assistant Professor of Mechanical Engineering, Program Director of the Biomedical Engineering Program, and Director of the Center for Biomechanical and Rehabilitation Engineering (CBRE), University of the District of Columbia (UDC), Washington, DC.
Faculty Advisor: Lara Thompson, lara.thompson@udc.edu
Role: I first conducted background literature searches to become more familiar with current investigations involving upper-limb prosthetics. For this research, I designed the hand gesture experiments: 1) Determined which hand gestures were to be tested. 2) Selected the appropriate sensors and equipment for the experimental design. 3) Acquired data for both the (human) controls and the robotic hand. 4) Analyzed the data to obtain results. After obtaining our results, I interpreted the differences between the control and robotic hand data for the various hand gestures.