An approach to natural gesture in virtual environments
1995, ACM Transactions on Computer-Human Interaction
https://doi.org/10.1145/210079.210080…
22 pages
Abstract
This article presents research (an experiment and the resulting prototype) on a method for treating gestural input so that it can be used in multimodal applications, such as interacting with virtual environments.
Related papers
2016
The steady growth of technology has extended all forms of human-computer communication. Since the emergence of more sophisticated interaction devices, Human-Computer Interaction (HCI) research has taken up the issue of Non-Verbal Communication (NVC). Many applications today, such as interactive entertainment and virtual reality, require more natural and intuitive interfaces. Human gestures constitute a large space of actions expressed by the body, face, and/or hands. Hand gestures are frequently used in daily life, so they offer an easy, alternative way to communicate with computers. This paper introduces a real-time hand gesture recognition and tracking system that identifies different and dynamic hand postures. To improve the user experience, a set of system functions was implemented in a virtual world so that the user can interact through a data glove device.
Computación y Sistemas, 2018
Many interaction techniques have been developed for virtual worlds, including the use of novel devices. Technological development has brought us to a point where interaction devices are no longer confined to high-technology laboratories. In this context, we can now build natural user interfaces, and their mass adoption raises research challenges. In this paper we analyze the use of gesture-based interaction for navigating virtual worlds. To that end, we created a virtual world and contrasted interfaces based on hand or body gestures with interaction based on mouse and keyboard. The results indicate that such interaction is not perceived as natural even when it imitates what we do in real life.
Lecture Notes in Computer Science, 2002
This paper describes the development of a natural interface to a virtual environment. The interface is a natural pointing gesture that replaces the pointing devices normally used to interact with virtual environments. The pointing gesture is estimated in 3D using kinematic knowledge of the arm during pointing and monocular computer vision. The latter is used to extract the 2D position of the user's hand and map it into 3D. Off-line tests of the system show promising results, with an average error of 76 mm when pointing at a screen 2 m away. A real-time implementation is in progress and is expected to run at 25 Hz.
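The pointing estimate described above can be illustrated with a simple geometric sketch: take the pointing direction as the ray from the shoulder through the hand and intersect it with the screen plane. The shoulder and hand coordinates below are invented for illustration; this is not the authors' kinematic model, only a minimal stand-in for its final ray-intersection step.

```python
import numpy as np

def pointing_target(shoulder, hand, screen_z=2.0):
    """Intersect the shoulder->hand ray with a screen plane at z = screen_z.

    A minimal geometric sketch: the pointing direction is approximated
    by the line from the shoulder through the hand, extended until it
    hits the screen.
    """
    shoulder = np.asarray(shoulder, dtype=float)
    hand = np.asarray(hand, dtype=float)
    direction = hand - shoulder
    if direction[2] == 0:
        raise ValueError("ray is parallel to the screen plane")
    t = (screen_z - shoulder[2]) / direction[2]
    return shoulder + t * direction

# User 2 m from the screen, arm pointing slightly up and to the right.
target = pointing_target(shoulder=[0.0, 1.4, 0.0], hand=[0.3, 1.5, 0.5])
print(target)  # indicated point on the screen plane
```

With these sample coordinates the ray reaches the screen at x = 1.2 m, y = 1.8 m, which makes the reported 76 mm average error easy to interpret as a distance on that plane.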
2011
The Research Bulletin of Jordan ACM, ISSN: 2078-7952, Volume II(III) ABSTRACT A Virtual Environment (VE) system offers a natural and intelligent user interface. Hand gesture recognition allows more efficient and easier interaction in a VE than human-computer interface (HCI) devices like keyboards and mice. We propose a hand gesture recognition interface that generates commands to control objects directly in a game. Our novel hand gesture recognition system combines Bag-of-Features and a Support Vector Machine (SVM) to realize user-friendly interaction between humans and computers. The HCI based on hand gesture recognition interacts with objects in a 3D virtual environment: with this interface, the user can control and direct a helicopter through a set of hand gesture commands governing its movements. Our evaluation shows that the hand gesture recognition interface provides more intuitive and flexible interaction for the user than other HCI devices.
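The Bag-of-Features + SVM pipeline named in that abstract can be sketched in a few lines: cluster local descriptors into a visual vocabulary, encode each image as a histogram of visual words, and train an SVM on the histograms. The synthetic descriptors, vocabulary size, and class centers below are invented for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_gesture(center, n_desc=40):
    """Fake local descriptors for one hand image, clustered near a center
    (standing in for real SIFT/SURF-style features)."""
    return center + 0.1 * rng.standard_normal((n_desc, 8))

# Two gesture classes, 10 sample "images" each.
images = [make_gesture(np.zeros(8)) for _ in range(10)] + \
         [make_gesture(np.ones(8)) for _ in range(10)]
labels = [0] * 10 + [1] * 10

# 1. Build the visual vocabulary by clustering all descriptors.
vocab = KMeans(n_clusters=5, n_init=10, random_state=0).fit(np.vstack(images))

# 2. Encode each image as a normalized histogram of visual-word counts.
def encode(descriptors):
    words = vocab.predict(descriptors)
    hist = np.bincount(words, minlength=5).astype(float)
    return hist / hist.sum()

X = np.array([encode(img) for img in images])

# 3. Train an SVM on the histograms and classify a new image.
clf = SVC(kernel="rbf").fit(X, labels)
pred = clf.predict([encode(make_gesture(np.ones(8)))])
print(pred)  # expected: class 1
```

A real system would replace `make_gesture` with descriptors extracted from camera frames; the vocabulary/histogram/SVM stages are otherwise the same shape.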
Journal of Artificial Intelligence and Systems
The user interface has special importance in immersive virtual environments. Interactions based on simple, conceivable hand gestures may enhance the immersivity of a Virtual Environment (VE). However, due to structural issues such as the small size and complex shape of the human hand, recognizing hand gestures is challenging. This work introduces a novel interaction technique that performs the basic interaction tasks through simple hand movement instead of distinct gestures. With an ordinary camera, the fist posture of the hand is segmented out of the image stream using an optimal segmentation model. Like pressing a button with a thumb, the status of the thumb is traced to activate or deactivate interactions. After activation, the trajectory of the hand is followed to manipulate a virtual object about an arbitrary axis. Without training or comparison of gestures, the basic interactions required in a VE are performed by perceptive hand movement. By incorporating image processing into the realm of VE, the technique is implemented in a case-study project, FIRST (Feasible Interaction by Recognizing the Status of Thumb). A group of 12 users evaluated the system under moderate lighting. The outcomes of the evaluation revealed that the technique is suitable for Virtual Reality (VR) applications.
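The thumb-as-button idea can be sketched as a tiny state machine: a tucked thumb activates manipulation, and hand motion is then accumulated into the manipulation (here a rotation angle). The observation stream and scale factor below are invented for illustration; FIRST itself derives them from segmented camera frames.

```python
def run_interaction(samples, degrees_per_unit=90.0):
    """samples: list of (thumb_visible, hand_x) observations per frame.

    A minimal sketch of the thumb-as-button technique: while the thumb
    is tucked (not visible) the interaction is active, and horizontal
    hand motion is accumulated as rotation about a fixed axis.
    """
    angle, last_x = 0.0, None
    for thumb_visible, hand_x in samples:
        active = not thumb_visible          # tucked thumb == button pressed
        if active and last_x is not None:
            angle += degrees_per_unit * (hand_x - last_x)
        last_x = hand_x if active else None  # reset on release
    return angle

# Thumb visible (idle), then tucked while the hand sweeps right.
stream = [(True, 0.0), (False, 0.0), (False, 0.2), (False, 0.5), (True, 0.5)]
print(run_interaction(stream))  # accumulated rotation, ~45 degrees
```

Resetting `last_x` on release means motion made while the "button" is up is ignored, which mirrors the activate/deactivate behavior the abstract describes.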
PRZEGLĄD ELEKTROTECHNICZNY
The main novelty presented in this paper is the application and evaluation of our Gesture Description Language (GDL) classifier as a touchless interface for a virtual reality (VR) environment. In our VR system, all interaction is done through gesture and body-movement analysis (a so-called natural user interface). For our needs we adapted a semi-realistic VR system (a block engine). We tested different aspects of the proposed interface on a group of 26 persons of both sexes and a wide age range (from 5 years to 40+). The results we obtained prove that GDL can be successfully applied in systems that require real-time action recognition, especially educational software and games that aim to increase students' motivation and engagement while they learn.
International Journal of Interactive Multimedia and Artificial Intelligence, 2019
Three-dimensional (3D) interaction is the plausible form of human interaction inside a Virtual Environment (VE). The rise of Virtual Reality (VR) applications in various domains demands a feasible 3D interface. Ensuring immersivity in a virtual space, this paper presents an interaction technique in which manipulation is performed by perceptive gestures of the two dominant fingers: the thumb and the index finger. Two fingertip thimbles made of paper are used to trace the states and positions of the fingers with an ordinary camera. Based on the finger positions, the basic interaction tasks (selection, scaling, rotation, translation, and navigation) are performed by intuitive finger gestures. Without keeping a gestural database, the feature-free detection of the fingers guarantees speedier interaction. Moreover, the system is user-independent and depends neither on the size nor on the color of the user's hand. The technique is implemented for evaluation in a case-study project, Interactions by the Gestures of Fingers (IGF). The IGF application traces finger gestures using the OpenCV libraries at the back end; at the front end, the objects of the VE are rendered accordingly using the Open Graphics Library (OpenGL). The system was assessed under moderate lighting by a group of 15 users, and the usability of the technique was further investigated in games. The outcomes of the evaluations revealed that the approach is suitable for VR applications in terms of both cost and accuracy.
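The thimble-tracking step can be illustrated with plain array operations: threshold the frame for the thimble's color and take the centroid of the matching pixels. The color range and the tiny synthetic frame below are invented for illustration; the IGF system itself runs equivalent thresholding on camera frames via OpenCV.

```python
import numpy as np

def thimble_centroid(frame, lo, hi):
    """Return the (row, col) centroid of pixels whose RGB values fall
    inside [lo, hi], or None if no pixel matches - a stand-in for the
    per-frame color-thresholding step a thimble tracker would run."""
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Synthetic 6x6 RGB frame: a red "thimble" patch on a dark background.
frame = np.zeros((6, 6, 3), dtype=np.uint8)
frame[2:4, 3:5] = [200, 30, 30]

center = thimble_centroid(frame, lo=[150, 0, 0], hi=[255, 80, 80])
print(center)  # centroid of the red patch: (2.5, 3.5)
```

Because only a color range is matched, no per-user training or gesture database is needed, which is the property the abstract calls feature-free detection.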
Human-Computer Interaction – INTERACT 2021, 2021
We explore gestures as interaction methods in virtual reality (VR). We detect hand and body gestures through human pose estimation on off-the-shelf optical camera images using machine learning, obtaining reliable gesture recognition without additional sensors. We then employ an avatar to prompt users to learn and use gestures to communicate. Finally, to understand how well gestures serve as interaction methods, we compare the studied gesture-based interaction methods with common baseline interaction modalities in VR (controllers, gaze interaction) in a pilot study including usability testing.
Engineering and Scientific International Journal - Divya Udayan J , 2020
Recent developments in virtual reality (VR) interaction with 3D cameras and sensors such as the Kinect, range cameras, and the Leap Motion controller have opened up opportunities for human-computer interaction (HCI) applications. Hand gestures are one of the most popular ways people interact with computers, and automatic hand gesture recognition is a suitable means of interacting with virtual reality systems. This paper focuses on the study and analysis of applications based on gesture interaction technology in virtual reality. Custom gestures for pointing, grabbing, zooming in/out, and swapping were defined and implemented in Unity 3D with the Leap Motion SDK. The effectiveness of the hand gestures was analyzed through recorded user experience and a questionnaire.
