HAND GESTURE RECOGNITION SYSTEM FOR MUTE PATIENTS
2024, SSRN Electronic Journal
https://doi.org/10.2139/SSRN.4685916…
4 pages
Abstract
One of the standard sign language techniques is the use of hand gestures, yet the ability of mute people to communicate with others remains highly limited. This project aims to facilitate the diagnosis of mute patients using a hand gesture recognition system housed in a right-hand glove, with one flex sensor for each of the five fingers. The system is designed to recognize eleven Arabic Sign Language letters that represent different phrases, which are converted to audio to help the doctor make the right diagnosis. The system compares the identified signal with stored data; when a match is found, an MP3 audio file is played as the output. The eleven hand gestures are already familiar to the patients, and each represents a phrase that answers a question frequently asked when seeing a general practitioner. The system converted all of the samples successfully, with an accuracy of 90%.
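The matching step described above can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: the gesture names, template readings, tolerance, and audio file names are all assumed for the example, and a real glove would feed live ADC values from the five flex sensors.

```python
# Hypothetical sketch of the glove's matching step: a five-element flex-sensor
# reading is compared against stored gesture templates and, on a match, the
# corresponding audio file name is returned for playback. All values below
# are illustrative, not figures from the paper.

GESTURES = {
    # gesture name -> (per-finger template reading, audio file to play)
    "yes":  ((900, 200, 200, 200, 200), "yes.mp3"),
    "no":   ((200, 900, 900, 200, 200), "no.mp3"),
    "pain": ((900, 900, 900, 900, 900), "pain.mp3"),
}

TOLERANCE = 100  # allowed per-finger deviation, in raw ADC units

def match_gesture(reading):
    """Return the audio file for the first template within tolerance, or None."""
    for name, (template, audio) in GESTURES.items():
        if all(abs(r - t) <= TOLERANCE for r, t in zip(reading, template)):
            return audio
    return None

print(match_gesture((880, 210, 190, 230, 250)))  # close to the "yes" template
```

On an unmatched reading the function returns None, which a real system could use to stay silent rather than play a wrong phrase.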
Related papers
Deaf and dumb people use sign language for their communication, but it is difficult for normal people to understand. The aim of this project is to reduce the barrier between them, and its main objective is to produce an algorithm for accurate recognition of hand gestures. The project uses a hand-glove model for gesture recognition: a MEMS sensor detects hand motions based on stress, and the measured values are stored in the microcontroller's memory unit. The output voices are stored in advance on the voice processor unit. Depending on the hand motion, the output is displayed on the LCD and also played through the speaker, allowing deaf and dumb people to communicate with normal people.
Communications in Computer and Information Science, 2019
Around 5% of people across the globe have difficulty speaking or are unable to speak. To overcome this difficulty, sign language came into the picture: a method of non-verbal communication usually used by deaf and mute people. Another problem that arises with sign language is that people without hearing or speaking problems do not learn it, which creates a severe communication barrier. To resolve this issue, this paper makes use of computer vision and machine learning along with a Convolutional Neural Network. The objective of this paper is to facilitate communication between deaf and mute people and others. To achieve this objective, a system is built to convert hand gestures to voice using gesture understanding and motion capture. This system will be helpful for deaf and mute people as it will increase their communication with other people.
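The core operation a CNN applies to a gesture image is convolution. The sketch below is illustrative only, not the paper's model: it implements a single valid-mode convolution in NumPy with a hand-picked vertical-edge kernel, where a real system would stack many learned filters with pooling and dense layers.

```python
import numpy as np

# Illustrative sketch (not the paper's model): one convolution step of the
# kind a CNN applies to a gesture image. The edge kernel is hand-picked for
# demonstration, not learned from data.

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a grayscale image with a kernel."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 4x4 "image" whose right half is bright; the kernel responds only at
# the boundary column, which is the kind of feature a CNN layer extracts.
image = np.array([[0.0, 0.0, 1.0, 1.0]] * 4)
edge_kernel = np.array([[-1.0, 1.0]])
response = conv2d(image, edge_kernel)
```

Stacking such filtered responses, then classifying them, is what lets a CNN map a hand image to a gesture label.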
International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2022
Rising incidents of visual and hearing impairment are a matter of global concern. India alone has around 12 million visually impaired people, and over 21 million people are either blind or deaf or both. For blind people there are existing solutions such as eye donation, and hearing aids for the deaf, but not everyone can afford them. The purpose of our project is to provide an effective method of communication between normal people and impaired people. According to a research article in the "Global World" on January 4, 2017, with a deaf community of millions, hearing India is only just beginning to sign. To address this problem, we put forward a model based on modern technologies like machine learning, image processing, and artificial intelligence to provide a potential solution and bridge the communication gap. The sign method is the most accepted means of communication for impaired people. The model will give output as text and voice in regional languages as well as English, so it can reach the vast majority of the population in rural as well as urban India. This project will provide accessibility, convenience, and safety to our visually impaired brothers and sisters, who are looked down upon by society just because of their disability.
International Journal of Innovative Technology and Exploring Engineering
Sign language and facial expressions are the major means of communication for speech-impaired people. General people can understand facial expressions to an extent but cannot understand sign language, so mute people are unable to express their thoughts to normal humans. To reduce this communication gap, this paper presents an electronic system that helps mute people exchange their ideas with a normal person in emergency situations. The system consists of a glove worn by the subject that converts hand gestures to speech and text; the displayed message will also help deaf people understand their thoughts. The prototype uses a Raspberry Pi 3 as the microcontroller along with flex sensors and an accelerometer. The resistance of each flex sensor changes with the bending of the subject's fingers, and the accelerometer measures the angular displacement of the wrist along the y-axis. The microcontroller takes the input from th...
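A flex sensor's bend is typically read through a voltage divider and then mapped to an angle. The sketch below shows that conversion under stated assumptions: the 3.3 V supply, 10 kΩ fixed resistor, and 25–100 kΩ flex range are typical component values, not figures from the paper.

```python
# Hedged sketch of reading a flex sensor: infer its resistance from a
# voltage-divider output, then map resistance linearly to a bend angle.
# Component values are typical assumptions, not taken from the paper.

VCC = 3.3            # supply voltage (V)
R_FIXED = 10_000.0   # fixed divider resistor to ground (ohms)

def flex_resistance(v_out):
    """Sensor resistance given the divider output voltage (sensor on top)."""
    return R_FIXED * (VCC - v_out) / v_out

def bend_angle(resistance, r_straight=25_000.0, r_full=100_000.0):
    """Linearly map resistance to a 0..90 degree bend, clamped at the ends."""
    frac = (resistance - r_straight) / (r_full - r_straight)
    return max(0.0, min(90.0, 90.0 * frac))

angle = bend_angle(flex_resistance(0.6))  # small v_out -> large R -> big bend
```

Real flex sensors are not perfectly linear, so a lookup table or per-finger calibration usually replaces the linear map.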
Generally, dumb people use sign language for communication, but they find it difficult to communicate with others who don't understand sign language. This project aims to lower this barrier by developing an electronic device that can translate sign language into speech, making communication between the mute community and the general public possible. A wireless data glove is used: a normal cloth driving glove fitted with flex sensors along the length of each finger and the thumb. Mute people can use the glove to perform hand gestures, which are converted into speech so that normal people can understand their expression. Sign language is the language used by mute people; it is a communication skill that uses gestures instead of sound to convey meaning, simultaneously combining hand shapes, orientations and movement of the hands, arms or body, and facial expressions to express a speaker's thoughts fluidly. Signs are used to communicate words and sentences to an audience. A gesture in a sign language is a particular movement of the hands with a specific shape made out of them. A sign language usually provides signs for whole words; it can also provide signs for letters to spell words that have no corresponding sign. In this project the flex sensor plays the major role: flex sensors change in resistance depending on the amount of bend applied to them. The final implemented design uses a copper-plate-based glove, made with small metal strips fixed on the five fingers of the glove. A ground plate is preferred over individual metal strips because the contact area for ground is larger, facilitating easy identification of finger position. We are in the process of developing a prototype using this approach to reduce the communication gap between differently abled and normal people.
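The copper-contact idea reduces each finger to a binary signal: either its strip touches the common ground plate or it does not. A minimal sketch, with an assumed two-entry sign table for illustration:

```python
# Illustrative sketch (details assumed, not from the paper): each fingertip
# strip either touches the ground plate (1) or not (0), giving a 5-bit
# contact pattern, thumb to little finger, that indexes a sign.

SIGN_TABLE = {
    (1, 1, 0, 0, 0): "A",
    (0, 0, 1, 1, 1): "B",
}

def read_sign(contacts):
    """Map a 5-tuple of contact states to a sign, or None if unknown."""
    pattern = tuple(int(bool(c)) for c in contacts)
    return SIGN_TABLE.get(pattern)

print(read_sign((True, True, False, False, False)))  # prints "A"
```

With five binary contacts there are at most 32 distinguishable patterns, which is why contact gloves are often combined with flex sensors for larger vocabularies.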
This paper presents a sign-to-speech converter for dumb people.[1] In the present world it is very difficult for dumb people to talk with ordinary people, so it becomes impossible for them to communicate unless ordinary people learn sign language. The sign language of the dumb is quite difficult to learn, and it is not possible for everybody to learn it, so not every person can share their thoughts with these physically impaired people. Here is a system that enables dumb people to communicate with everyone.[2] In this system a webcam is placed in front of the physically impaired person. The person puts his fingers in front of the webcam, which captures the hand gesture and performs image processing using the principal component analysis (PCA) algorithm.[3] The captured coordinates are mapped against those previously stored, and the matching picture from the database is identified. Continuing in this way, the physically impaired person can compose the entire sentence he wants to communicate. Later this sentence is translated into speech so that it is audible to everyone.
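The PCA-plus-matching pipeline can be sketched compactly. This is an illustration in the spirit of the description above, with toy four-pixel "images" standing in for webcam frames; the dimensions, data, and helper names are all assumed.

```python
import numpy as np

# Illustrative PCA matching sketch (details assumed, not from the paper):
# flattened gesture images are projected onto the top principal axes, and a
# query is matched to the nearest stored projection.

def pca_fit(X, k):
    """Return the mean and top-k principal axes of row-vector samples X."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:k]

def project(x, mean, axes):
    """Project one sample into the k-dimensional PCA subspace."""
    return axes @ (x - mean)

def nearest(query, projections):
    """Index of the stored projection closest to the query (Euclidean)."""
    dists = [np.linalg.norm(query - p) for p in projections]
    return int(np.argmin(dists))

# Toy database of three "gesture images" and a noisy query near gesture 1.
X = np.array([[0.0, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 1.0, 0.0]])
mean, axes = pca_fit(X, k=2)
stored = [project(x, mean, axes) for x in X]
query = project(np.array([0.9, 1.0, 0.1, 0.0]), mean, axes)
print(nearest(query, stored))
```

Matching in the low-dimensional PCA space rather than on raw pixels is what makes this kind of lookup fast and somewhat robust to noise.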
The proposed system introduces an efficient and fast algorithm for identifying the number of fingers opened in a gesture representing an alphabet of the Binary Sign Language. The idea consists of designing and building an intelligent system using a group of flex sensors together with machine learning and artificial intelligence concepts, taking sign language hand gestures as input and generating easily recognizable outputs. The objective of this project is to develop an intelligent system that can act as a translator between sign language and spoken language dynamically, making communication between people with hearing impairment and normal people both effective and efficient. After recognizing a gesture, the output is expressed as voice and as text for display.
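Counting opened fingers from flex readings reduces to thresholding each sensor. A minimal sketch, assuming an illustrative ADC threshold and symbol table (neither is taken from the paper):

```python
# Sketch of the finger-count idea (assumed details): each flex reading is
# thresholded to open/closed, and the count of open fingers selects a symbol.

OPEN_THRESHOLD = 500  # ADC reading below this means the finger is straight

def count_open_fingers(readings):
    """Number of fingers whose flex reading is below the open threshold."""
    return sum(1 for r in readings if r < OPEN_THRESHOLD)

SYMBOLS = {0: "fist", 1: "one", 2: "two", 3: "three", 4: "four", 5: "five"}

def classify(readings):
    """Map a five-finger reading to the symbol for its open-finger count."""
    return SYMBOLS[count_open_fingers(readings)]

print(classify((100, 120, 900, 880, 910)))  # two fingers open -> "two"
```

The count alone cannot distinguish which fingers are open, so gestures with equal counts would need the per-finger pattern as well.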
Journal of Emerging Technologies and Innovative Research, 2018
People who are unable to speak and hear are called speech- and hearing-impaired people, and they face many problems living in the world. The main problem concerns communication, because the language they use is sign language, which is difficult for normal people to understand without prior knowledge. The main goal of this project is to enable reliable communication between normal and speech-impaired people. To achieve this goal, a specially designed data glove is required, consisting of flex sensors and a 3-axis accelerometer. A template-matching algorithm is used to handle and process the sensor data. Using these sensors, the system supports real-time communication in a simple way. Index Terms: Sign language, Gesture recognition, Flex sensor, Data glove, Template matching, Vision- and glove-based systems.
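Template matching over combined glove data can be sketched as a nearest-template search with a rejection threshold. Everything below is illustrative: the eight-element feature layout (five flex readings plus three accelerometer axes), the templates, and the threshold are assumptions, not values from the paper.

```python
# Hedged sketch of the template-matching step: a live feature vector is
# compared against stored templates by sum of squared differences (SSD);
# the best match wins only if it is under a rejection threshold.

TEMPLATES = {
    # gesture -> [flex1..flex5, accel_x, accel_y, accel_z], all illustrative
    "hello":  [800, 200, 200, 200, 200, 0, 0, 1],
    "thanks": [200, 800, 800, 800, 800, 0, 1, 0],
}
REJECT_ABOVE = 50_000.0  # SSD beyond this means "no recognized gesture"

def ssd(a, b):
    """Sum of squared differences between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match(sample):
    """Best-matching template name, or None if nothing is close enough."""
    best = min(TEMPLATES, key=lambda name: ssd(sample, TEMPLATES[name]))
    return best if ssd(sample, TEMPLATES[best]) <= REJECT_ABOVE else None
```

The rejection threshold is what keeps transitional hand positions from being spoken aloud as false matches.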
For many deaf and dumb people, sign language is the principal means of communication, and normal people face problems while communicating with speech-impaired people. Our proposed system automatically recognizes sign language to help normal people communicate more effectively with speech-impaired people. The system recognizes hand signs with the help of specially designed gloves, and the recognized gestures are translated into text and voice in real time. Thus the system reduces the communication gap between normal and speech-impaired people.