SHS Web of Conferences, Volume 139 (2022)
The 4th ETLTC International Conference on ICT Integration in Technical Education (ETLTC2022)
Article Number: 03015
Number of pages: 8
Section: Topics in Computer Science
DOI: https://doi.org/10.1051/shsconf/202213903015
Published online: 13 May 2022
A Low-cost Raspberry Pi-based Control System for Upper Limb Prosthesis
Adaptive Systems Laboratory, School of Computer Science and Engineering, The University of Aizu, Japan
a) Corresponding author: s1260084@u-aizu.ac.jp
b) Electronic mail: d8211104@u-aizu.ac.jp
c) Electronic mail: benab@u-aizu.ac.jp
In recent years, robots have been introduced in most factories. However, manual work continues in places where large robots cannot be installed. In particular, traditional Japanese crafts are made by hand, and the people who engage in them are called craftsmen. Such artisans generally need years of training and cannot become experts right away. One of the problems these artisans face is a lack of successors. To address this challenge, this paper proposes a Raspberry Pi-based control method for a prosthetic hand driven by hand gestures captured from a camera sensor, which allows the prosthetic hand to learn the hand movements of craftsmen and perform the crafts. The advantage of this approach is that no training, which usually takes years, is needed. To control the prosthetic hand, gesture images are captured from a camera sensor, converted to HSV and binarized, and then classified into one of five gestures by a convolutional neural network (CNN) implemented on the Raspberry Pi. The recognized gesture is then relayed to the prosthetic hand, which mimics it. To evaluate performance, a dataset of 2000 captured images per gesture was created; these gestures clearly define the closing and opening of the fingers. Using this dataset of 32×32 hand gesture images captured from the camera, we first validated the trained CNN in software, without the Raspberry Pi, and achieved an accuracy of 99.63%. We then implemented it on the Raspberry Pi and performed a real-time evaluation by recognizing the five hand gestures captured live from the camera sensor; four of the five gestures were correctly recognized. We present the design of a low-cost prosthetic hand based on Raspberry Pi hardware and evaluate its real-time hand gesture recognition. The evaluation results show that the proposed system correctly recognizes four of the five hand gestures.
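The preprocessing stage described in the abstract (HSV conversion, binarization, downsampling to a 32×32 image for the CNN) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the skin-tone thresholds, the 128×128 input resolution, and the block-average pooling are all assumptions made for the example, and the RGB-to-HSV conversion is written out in plain NumPy.

```python
import numpy as np

def rgb_to_hsv(frame):
    """Vectorized RGB -> HSV conversion; all output channels in [0, 1]."""
    img = frame.astype(np.float32) / 255.0
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = img.max(axis=-1)
    minc = img.min(axis=-1)
    v = maxc
    d = maxc - minc
    s = np.where(maxc > 0, d / np.where(maxc > 0, maxc, 1.0), 0.0)
    safe_d = np.where(d == 0, 1.0, d)  # avoid division by zero on gray pixels
    rc = (maxc - r) / safe_d
    gc = (maxc - g) / safe_d
    bc = (maxc - b) / safe_d
    h = np.select([maxc == r, maxc == g],
                  [bc - gc, 2.0 + rc - bc],
                  default=4.0 + gc - rc)
    h = np.where(d == 0, 0.0, (h / 6.0) % 1.0)
    return np.stack([h, s, v], axis=-1)

def preprocess(frame, size=32):
    """Binarize a square camera frame with a (hypothetical) skin-tone HSV
    mask, then downsample to a size x size binary image for the CNN.
    Assumes frame height and width are equal and divisible by `size`."""
    hsv = rgb_to_hsv(frame)
    h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
    # Illustrative skin-tone thresholds: hue 0-50 deg, moderate saturation.
    mask = (h <= 50 / 360.0) & (s >= 0.15) & (s <= 0.9) & (v >= 0.3)
    # Block-average pooling, then re-threshold to keep the image binary.
    block = frame.shape[0] // size
    pooled = mask.astype(np.float32).reshape(
        size, block, size, block).mean(axis=(1, 3))
    return (pooled > 0.5).astype(np.uint8)

# Usage: a synthetic 128x128 frame with a skin-colored square in the middle.
frame = np.zeros((128, 128, 3), dtype=np.uint8)
frame[32:96, 32:96] = (200, 140, 120)
binary = preprocess(frame)   # 32x32 array of 0s and 1s, square region set to 1
```

In a deployment such as the one the paper describes, the synthetic frame would be replaced by frames grabbed from the camera sensor, and the 32×32 binary image would be fed to the trained CNN for five-way gesture classification.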
Key words: Deep Neural Network / Prosthesis / Hardware / Frame-based / Real-time
© The Authors, published by EDP Sciences, 2022
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.