Intelligent Prosthetic Hand Controlled by Voice Commands
Abstract
This research presents the design and implementation of an intelligent electronic prosthetic hand capable of performing essential daily tasks for individuals who have lost their natural hands. The proposed system integrates artificial intelligence techniques for speech recognition to interpret voice commands and convert them into precise finger movements. The hardware is built around an Arduino Uno microcontroller, servo motors for actuation, and a Bluetooth communication module. Experimental results demonstrate that the prototype can accurately execute a set of predefined voice commands, enabling the user to control each finger independently or collectively.
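As a rough illustration of the pipeline just described (voice command received over Bluetooth, interpreted by the Arduino Uno, executed by finger servos), the sketch below shows one way such firmware could receive recognized command tokens and drive the servos. The pin assignments, baud rate, command vocabulary, and servo angles are illustrative assumptions, not the paper's actual code.

```cpp
// Illustrative sketch only: pins, commands, and angles are assumptions,
// not the firmware described in the paper.
#include <Servo.h>
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);            // assumed RX/TX pins for an HC-05 module
Servo fingers[5];                     // one servo per finger
const int SERVO_PINS[5] = {3, 5, 6, 9, 12};

void setAll(int angle) {
  for (int i = 0; i < 5; i++) fingers[i].write(angle);
}

void setup() {
  bt.begin(9600);                     // typical HC-05 default baud rate
  for (int i = 0; i < 5; i++) fingers[i].attach(SERVO_PINS[i]);
  setAll(0);                          // start with the hand open
}

void loop() {
  if (bt.available()) {
    // The phone-side speech recognizer is assumed to send one token per line.
    String cmd = bt.readStringUntil('\n');
    cmd.trim();
    if (cmd == "close")      setAll(170);   // full grasp
    else if (cmd == "open")  setAll(0);     // release
    else if (cmd == "point") {              // index extended, others flexed
      setAll(170);
      fingers[1].write(0);
    }
  }
}
```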
Related papers
Artificial neural networks (ANNs) were used to classify EMG signals from an arm. Using an amplifier card from the Smart Hand prosthetic hand project, 16-channel EMG signals were collected from the patient's arm and filtered. After time-domain feature extraction, simple back-propagation training was used to train the networks. During training, the patient moved his fingers according to a predefined pattern. After training, the patient could move an artificial hand by duplicating the movements made during training. Artificial hands are nothing new: one of the earliest mentions is of a Roman general who fought with an iron arm around the year 50 AD, and much research has been done in this area since. Hopefully, this work will show that this approach to controlling a hand prosthesis is viable and that it has benefits over previously used methods.
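The abstract does not name the time-domain features used; mean absolute value, zero crossings, and waveform length are common choices in the EMG literature. A minimal, illustrative extraction routine (plain C++, single channel) might look like this:

```cpp
// Illustrative time-domain EMG features (the abstract does not specify which
// were used): mean absolute value (MAV), zero crossings (ZC), and waveform
// length (WL) over one analysis window, for a single channel.
#include <cmath>
#include <cstdio>
#include <vector>

struct TDFeatures { double mav, zc, wl; };

TDFeatures extractTD(const std::vector<double>& x, double deadzone = 0.01) {
  TDFeatures f{0.0, 0.0, 0.0};
  for (size_t i = 0; i < x.size(); ++i) {
    f.mav += std::fabs(x[i]);
    if (i > 0) {
      f.wl += std::fabs(x[i] - x[i - 1]);          // waveform length
      if (x[i] * x[i - 1] < 0.0 &&
          std::fabs(x[i] - x[i - 1]) > deadzone)   // reject noise crossings
        f.zc += 1.0;
    }
  }
  if (!x.empty()) f.mav /= x.size();
  return f;  // one such vector per channel would feed the back-propagation ANN
}

int main() {
  std::vector<double> window = {0.02, -0.05, 0.11, -0.08, 0.03};
  TDFeatures f = extractTD(window);
  std::printf("MAV=%.4f ZC=%.0f WL=%.4f\n", f.mav, f.zc, f.wl);
}
```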
Sensors, 2017
Polyarticulated prosthetic hands represent a powerful tool to restore functionality and improve quality of life for upper limb amputees. Such devices offer, on the same wearable node, sensing and actuation capabilities, which are not equally supported by natural interaction and control strategies. In state-of-the-art solutions, control is still performed mainly through complex encoding of gestures in bursts of contractions of the residual forearm muscles, resulting in a non-intuitive Human-Machine Interface (HMI). Recent research efforts explore the use of myoelectric gesture recognition for innovative interaction solutions; however, a considerable gap persists between research evaluation and implementation into successful complete systems. In this paper, we present the design of a wearable prosthetic hand controller based on intuitive gesture recognition and a custom control strategy. The wearable node directly actuates a polyarticulated hand and wirelessly interacts with a personal gateway (i.e., a smartphone) for the training and personalization of the recognition algorithm. Throughout the system development, we address the challenge of integrating an efficient embedded gesture classifier with a control strategy tailored for intuitive interaction between the user and the prosthesis. We demonstrate that this combined approach outperforms systems based on mere pattern recognition, since those target the accuracy of a classification algorithm rather than the control of a gesture. The system was fully implemented, tested on healthy and amputee subjects, and compared against benchmark repositories. The proposed approach achieves an error rate of 1.6% in end-to-end real-time control of commonly used hand gestures, while complying with the power and performance budget of a low-cost microcontroller.
ETI - The European Technology Institute, 2023
The main goal of the current work is to design, fabricate, and control a prosthetic upper limb prototype that simulates, as closely as possible, the shape, size, and motion of the natural arm, using five servo motors. A lightweight, high-efficiency, low-cost five-fingered soft robotic upper prosthetic arm prototype was designed, fabricated, and implemented to help people who have undergone an upper-arm amputation, restoring some arm functions and allowing them to be self-sufficient without assistance from others. The fabricated arm showed high flexibility with a cosmetic shape to obtain the best possible mechanisms for grasping various objects. It is capable of performing both grasping and opening motions in all the required degrees of motion, including the ability to move each finger (down to a single phalanx) individually to match the motion capabilities and efficiency of the real arm. The arm was controlled by voice commands through an Arduino Uno processor, using versatile and efficient C++ firmware that translates each command into the aforementioned motions of the prosthetic arm via the motors connected to the fingers. The fabricated prosthetic arm was simple, responsive, and functional enough for human daily activities to support a normal life.
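To make the command-to-motion translation concrete, here is a hedged sketch of a data-driven gesture table on the Arduino side: each voice token maps to one target angle per finger servo. The gesture names, pins, and angles are invented for illustration and are not taken from the paper.

```cpp
// Illustrative command-to-motion table: each voice token maps to one target
// angle per finger servo. Gesture names, pins, and angles are assumptions.
#include <Servo.h>
#include <string.h>

Servo fingers[5];                                   // thumb..little finger
const int PINS[5] = {3, 5, 6, 9, 12};

struct Gesture { const char* name; int angle[5]; };
const Gesture GESTURES[] = {
  {"open",  {0,   0,   0,   0,   0}},
  {"fist",  {160, 170, 170, 170, 170}},
  {"pinch", {150, 150, 0,   0,   0}},               // thumb-index pinch
};

bool applyGesture(const char* name) {
  for (const Gesture& g : GESTURES) {
    if (strcmp(g.name, name) == 0) {
      for (int i = 0; i < 5; i++) fingers[i].write(g.angle[i]);
      return true;
    }
  }
  return false;                                     // unknown token: hold pose
}

void setup() {
  for (int i = 0; i < 5; i++) fingers[i].attach(PINS[i]);
  applyGesture("open");
}

void loop() {}                    // recognized voice tokens would arrive here
```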
International journal of engineering research and technology, 2019
Current prosthetic hands have limited functionality and are cost-prohibitive. A cost-effective anthropomorphic prosthetic hand was designed. The novel design incorporates five individually actuated fingers in addition to powered thumb-roll articulation, which is unseen in commercial products. Fingertip grip force is displayed via LEDs for feedback control. The hand contains a battery and microcontroller. Multiple options for signal input and control algorithms are presented. A prototype will serve as a platform for future programming efforts. Keywords: functionality, articulation, microcontroller.
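The LED force display lends itself to a very small sketch. Assuming, purely for illustration, a force-sensitive resistor on analog pin A0 and three indicator LEDs (the abstract does not specify the sensor or pin layout):

```cpp
// Sketch of the LED grip-force display idea: fingertip force (here from an
// assumed force-sensitive resistor on A0) lights more LEDs as force rises.
// Pin choices and thresholds are assumptions for illustration.
const int FSR_PIN = A0;
const int LED_PINS[3] = {2, 4, 7};        // low / medium / high force

void setup() {
  for (int p : LED_PINS) pinMode(p, OUTPUT);
}

void loop() {
  int raw = analogRead(FSR_PIN);          // 0..1023
  digitalWrite(LED_PINS[0], raw > 200);   // light LED when threshold exceeded
  digitalWrite(LED_PINS[1], raw > 500);
  digitalWrite(LED_PINS[2], raw > 800);
  delay(50);                              // ~20 Hz update
}
```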
This paper exploits the growing technical ability to integrate biological systems with mechatronics to develop a functional prosthetic hand. Our focus is on reproducing human grasping operations using noninvasive electromyogram signals. The principal steps involved are: extraction and processing of electromyogram signals generated during grasping, classification of grasp types, mechatronic design of a prosthetic hand, and design and development of an intelligent biofeedback control architecture. The control architecture comprises a classifier based on a support vector machine and an embedded motor driver circuit to actuate the prosthesis in the intended manner of grasping. We report work in progress on the design and development of a biosignal-controlled prosthetic hand under funding from the Department of Information Technology, Government of India. As a first step toward the development of such a prosthesis, we have accomplished classification of six grasp types based on two-channel forearm electromyogram signals with 87% accuracy, comparable to that reported in the literature; the hand itself is under development. The hand is envisaged to have sensory feedback coupled with exteroceptive and proprioceptive sensors. The mechatronic design of the prosthetic hand is inspired by the human muscle-tendon system.
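The abstract names a support vector machine classifier but gives no implementation detail; as a hedged sketch, the decision side of a one-vs-rest linear SVM over precomputed EMG features could look like the following. The class count matches the six grasp types; the weights and biases are placeholders that a real system would obtain from offline training.

```cpp
// Hedged sketch of the decision side of a one-vs-rest linear SVM over
// precomputed EMG features; weights and biases are placeholders that a
// real system would obtain from offline training.
#include <cstdio>

const int N_CLASSES = 6;   // six grasp types, as reported in the abstract
const int N_FEATS   = 4;   // e.g., two channels x two time-domain features

double w[N_CLASSES][N_FEATS] = {};  // trained weights (placeholder zeros)
double b[N_CLASSES]          = {};  // trained biases  (placeholder zeros)

int classifyGrasp(const double* feats) {
  int best = 0;
  double bestScore = -1e300;
  for (int c = 0; c < N_CLASSES; c++) {
    double score = b[c];                 // decision value for class c
    for (int f = 0; f < N_FEATS; f++) score += w[c][f] * feats[f];
    if (score > bestScore) { bestScore = score; best = c; }
  }
  return best;                           // index of the predicted grasp type
}

int main() {
  double feats[N_FEATS] = {0.12, 3.0, 0.9, 2.0};
  std::printf("predicted grasp class: %d\n", classifyGrasp(feats));
}
```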
SHS Web of Conferences
In recent years, robots have been introduced in most factories. However, manual work continues in some places where large robots cannot be installed. In particular, traditional Japanese crafts are done by hand, and the people who engage in such crafts are called craftsmen. Generally, such artisans need years of training and cannot become experts right away. One problem these artisans face is the lack of successors. To address this challenge, this paper proposes a Raspberry Pi-based control method for a prosthetic hand using hand gestures captured by a camera sensor, which allows a prosthetic hand to learn the hand movements of the craftsmen and perform the crafts. The advantage of this is that there is no need for training, which usually takes years. To control the prosthetic hand, hand gestures are captured from a camera sensor, converted to HSV and binarized, and then classified into one of five gestures using a CNN implemented on the Raspberry Pi hardware. ...
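The capture-to-binarization front end described above can be sketched with OpenCV in C++. Note that the paper's implementation runs on a Raspberry Pi, and the skin-tone HSV thresholds below are rough assumptions, not the paper's values.

```cpp
// Sketch of the capture -> HSV -> binarize step described above, using
// OpenCV; the skin-tone HSV thresholds are rough assumptions.
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cap(0);                       // camera sensor
  cv::Mat frame, hsv, mask;
  while (cap.read(frame)) {
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV); // BGR -> HSV
    cv::inRange(hsv, cv::Scalar(0, 30, 60),      // assumed skin-tone range
                cv::Scalar(20, 150, 255), mask); // binarized hand mask
    // `mask` would then be resized and fed to the CNN gesture classifier.
    cv::imshow("hand mask", mask);
    if (cv::waitKey(1) == 27) break;             // Esc to quit
  }
  return 0;
}
```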
Frontiers in Neurorobotics, 2022
Hand prostheses should provide functional replacements for lost hands. Yet current prosthetic hands often are neither intuitive to control nor easy to use for amputees. Commercially available prostheses are usually controlled based on EMG signals triggered by the user to perform grasping tasks. Such EMG-based control requires long training and depends heavily on the robustness of the EMG signals. Our goal is to develop prosthetic hands with semi-autonomous grasping abilities that lead to more intuitive control by the user. In this paper, we present the development of prosthetic hands that enable such abilities as first results toward this goal. The developed prostheses provide intelligent mechatronics, including adaptive actuation, multi-modal sensing, and on-board computing resources, to enable autonomous and intuitive control. The hands are scalable in size and based on an underactuated mechanism which allows grasps to adapt to the shape of arbitrary objects. They integrate a mult...
IOP Conference Series: Materials Science and Engineering
Voice control is one of the easiest means of interacting with machines, as no extra effort is required to generate a control signal. In addition, voice control is more intuitive than other control methods. Many studies use voice recognition to control medical devices and hand prostheses in real time, but its use has some limitations. Furthermore, some studies take advantage of inertial measurements of a body segment to control hand prostheses. By reviewing the advantages and limitations of each control method, a new synchronised control system is proposed that combines voice recognition and inertial measurement based on three combination strategies, to render the prosthetic hand more dexterous, feasible, and easy to use. Five participants tested the control system based on the combination strategies to perform simple and complex prosthetic hand movements. The results showed that voice recognition had about 99% accuracy and rapid response time. Moreover, the inertial measurement control system improved the accuracy of the system, increased the degrees of freedom, and made the prosthetic hand easier and more feasible to use.
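The three combination strategies are not detailed in the abstract. As one hedged example of how voice and inertial inputs might be combined, the sketch below lets the voice token select the action while wrist pitch (stubbed here in place of a real IMU read) scales how far the hand closes; every name and threshold is an assumption for illustration.

```cpp
// Hedged sketch of one possible voice + inertial combination strategy:
// the voice command selects the grip, while wrist pitch from an IMU
// scales the closing angle. The IMU read is stubbed; the paper's three
// actual strategies are not specified in the abstract.
#include <Servo.h>

Servo fingers[5];
const int PINS[5] = {3, 5, 6, 9, 12};

float readWristPitchDeg() {
  // Placeholder: a real build would read an IMU (e.g., over I2C) here.
  return 0.0f;
}

void closeHand(float fraction) {             // 0.0 = open, 1.0 = full grasp
  int angle = (int)(fraction * 170);
  for (int i = 0; i < 5; i++) fingers[i].write(angle);
}

void setup() {
  Serial.begin(9600);                        // recognized voice tokens arrive here
  for (int i = 0; i < 5; i++) fingers[i].attach(PINS[i]);
}

void loop() {
  if (Serial.available()) {
    String cmd = Serial.readStringUntil('\n');
    cmd.trim();
    if (cmd == "grasp") {
      // Map wrist pitch (-45..+45 deg) onto a 50-100% closing fraction.
      float pitch = readWristPitchDeg();
      float frac = constrain(0.75f + pitch / 180.0f, 0.5f, 1.0f);
      closeHand(frac);
    } else if (cmd == "release") {
      closeHand(0.0f);
    }
  }
}
```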
2020
About half of upper-limb (UL) amputees do not wear a prosthesis. This is, in part, related to an inability to take full functional advantage of the prosthesis. To help address this issue, we have developed the Voice Activated Prosthesis Interface (VAPI) to allow individuals to supplement their conventional control with voice commands. Specifically, this study targeted accessing multiple grip patterns in multi-articulating hands. Data from amputee test subjects is reported showing an improvement in the time to complete tasks, more accurate grip selection, and reduced frustration with the prosthesis when using the voice recognition technology compared to standard myoelectric control.
2019 First International Conference on Transdisciplinary AI (TransAI), 2019
Speech recognition is one of the key topics in artificial intelligence, as it is one of the most common forms of communication in humans. Researchers have developed many speech-controlled prosthetic hands in the past decades, utilizing conventional speech recognition systems that use a combination of neural networks and hidden Markov models. Recent advancements in general-purpose graphics processing units (GPGPUs) enable intelligent devices to run deep neural networks in real time. Thus, state-of-the-art speech recognition systems have rapidly shifted from the paradigm of composite subsystem optimization to the paradigm of end-to-end optimization. However, a low-power embedded GPGPU cannot run these speech recognition systems in real time. In this paper, we show the development of deep convolutional neural networks (CNNs) for speech control of prosthetic hands that run in real time on an NVIDIA Jetson TX2 developer kit. First, the device captures speech and converts it into 2D features (such as a spectrogram). The CNN receives the 2D features and classifies the hand gestures. Finally, the hand gesture classes are sent to the prosthetic hand motion control system. The whole system is written in Python with Keras, a deep learning library that has a TensorFlow backend. Our experiments on the CNN demonstrate 91% accuracy and a 2 ms running time for producing hand-gesture classes (text output) from speech commands, which can be used to control prosthetic hands in real time.
