A Study on Motion-Based UI for Running Games with Kinect
Abstract
This study examines the efficiency of human motion-based UI for video games that use the Kinect motion capture system. We investigated play with the Kinect sensor in a running game that was developed with two kinds of UI. One UI consists of more intuitive, familiar full-body motions such as turning and jumping; the other consists of arm motions such as raising a hand. The UI based on arm motions was easier for users to master and produced higher success rates during play than the other UI. We therefore conclude that when a game is developed with Kinect and its UI is built on motion recognition, motions of the arms, rather than of other parts of the body, better help players improve their play skills and immerse themselves in the game.
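To make the two UI schemes concrete, the following sketch shows how commands of each kind could be recognized from Kinect-style skeleton joints. It is a minimal illustration, not the study's implementation; the joint names, coordinate convention (metres, y up), and thresholds are assumptions.

```python
# Illustrative sketch (not the paper's implementation): classifying the two
# kinds of motion commands described in the abstract from Kinect-style
# skeleton joints. Joint names and thresholds are assumptions.

def detect_arm_command(joints):
    """Arm-motion UI: a hand raised above the head triggers a command."""
    if joints["hand_right"][1] > joints["head"][1]:
        return "turn_right"
    if joints["hand_left"][1] > joints["head"][1]:
        return "turn_left"
    return None

def detect_body_command(joints, standing_hip_y, jump_threshold=0.15):
    """Full-body UI: jumping is inferred from the hip centre rising
    noticeably above its standing baseline (metres, assumed)."""
    if joints["hip_center"][1] - standing_hip_y > jump_threshold:
        return "jump"
    return None

# Example frame: joint name -> (x, y, z) in metres, camera space.
frame = {
    "head": (0.0, 1.60, 2.0),
    "hand_right": (0.3, 1.75, 1.9),   # raised above the head
    "hand_left": (-0.3, 1.00, 1.9),
    "hip_center": (0.0, 0.95, 2.0),
}
print(detect_arm_command(frame))                         # -> "turn_right"
print(detect_body_command(frame, standing_hip_y=0.90))   # -> None
```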
Related papers
Computer Animation and Virtual Worlds, 2013
Recent years have witnessed great improvements in ways of controlling games, yielding higher levels of interaction. The release of motion-controller devices radically changed the conventional ways of interaction used to control games, giving developers the opportunity to explore various new possibilities for interaction. One of these off-the-shelf tools, Microsoft Kinect for Xbox 360, recognizes the motions of players as game-controlling inputs. Although touchless interaction is perceived to be attractive, games that mimic real-life activities such as table tennis, sword fighting, baseball, and golf may benefit from the player holding a tangible object to get more involved in the game and sense the actions more deeply. In this thesis, a tangible gameplay interaction method is developed using Microsoft Kinect for Xbox 360 that senses whether or not the player holds an object in the hand; if so, it detects the object's dimensions and incorporates the hand-held object into gameplay by projecting the player's motions accordingly. The developed algorithm is implemented in an experimental game, and a user study revealed that the system achieves improved gameplay with more natural and accurate motion control, enabling new possible actions.
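The following sketch illustrates the general idea of sensing a hand-held object from a depth frame; it is an assumed simplification, not the thesis algorithm. The pinhole focal length, depth tolerance, and scan-along-a-ray scheme are all assumptions for illustration.

```python
# A minimal sketch of the idea described above, not the thesis algorithm:
# walk outward from the hand along the wrist->hand direction in the depth
# image and count how far the foreground (pixels near the hand's depth)
# continues. If it continues well past a typical hand size, assume a
# hand-held object and estimate its length.
import numpy as np

def estimate_held_object_length(depth_mm, wrist_px, hand_px,
                                depth_tol_mm=60, focal_px=525.0,
                                hand_extent_m=0.12):
    h, w = depth_mm.shape
    hand_depth = float(depth_mm[hand_px[1], hand_px[0]])
    # Unit step along the wrist -> hand direction in image space.
    d = np.array(hand_px, float) - np.array(wrist_px, float)
    d /= np.linalg.norm(d)
    # Walk pixel by pixel while the depth stays close to the hand's depth.
    steps, p = 0, np.array(hand_px, float)
    while True:
        p += d
        x, y = int(round(p[0])), int(round(p[1]))
        if not (0 <= x < w and 0 <= y < h):
            break
        if abs(float(depth_mm[y, x]) - hand_depth) > depth_tol_mm:
            break
        steps += 1
    # Pinhole model: metres per pixel at this depth.
    metres_per_px = (hand_depth / 1000.0) / focal_px
    extent_m = steps * metres_per_px
    holding = extent_m > hand_extent_m
    return holding, max(0.0, extent_m - hand_extent_m)

# Synthetic frame: a 2 m background with a thin "stick" at the hand's depth.
depth = np.full((240, 320), 2000, dtype=np.uint16)
depth[120, 160:240] = 900
holding, length_m = estimate_held_object_length(depth, wrist_px=(150, 120),
                                                hand_px=(160, 120))
print(holding, round(length_m, 2))
```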
In this study, a somatosensory game system is proposed using Kinect, Unity, and MotionBuilder. Kinect, a somatosensory device, serves as the primary interactive device, and its manipulation functions are plugged into the Unity software. Unity is a friendly and accessible game development platform; we use it to construct the virtual characters, weapons, and scenarios. MotionBuilder is applied to record the basic tricks and monster attacks. The purpose of this study is to create a sporty and fun monster-fighting game system. We use Kinect to detect the depth of the human skeleton and apply the skeleton information to a virtual character in Unity in order to control the martial arts moves performed by that character. Detection methods are proposed in which Kinect senses the positions of the player's hands and of the virtual weapons used to fight the monsters. The game also provides a wide variety of weapons with specific effects, which enhances the player's likelihood of defeating monsters; players thus take corresponding actions in order to survive. The results of integrating Unity and MotionBuilder with the Kinect-based somatosensory game support the feasibility and efficiency of the proposed method.
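As a rough illustration of one element of such a control scheme, the sketch below maps a shoulder-relative hand position onto on-screen weapon zones; the coordinate convention and zone layout are assumptions, not the authors' design.

```python
# A small sketch (assumed, not the authors' code) of one piece of the
# described control scheme: using the tracked hand position, expressed
# relative to the shoulders, to pick a virtual weapon from on-screen zones.
def select_weapon(joints, zones):
    """joints: name -> (x, y, z); zones: name -> ((x_min, x_max), (y_min, y_max))
    in shoulder-relative coordinates (assumed convention)."""
    sx, sy, _ = joints["shoulder_center"]
    hx, hy, _ = joints["hand_right"]
    rx, ry = hx - sx, hy - sy
    for weapon, ((x0, x1), (y0, y1)) in zones.items():
        if x0 <= rx <= x1 and y0 <= ry <= y1:
            return weapon
    return None

zones = {"sword": ((0.2, 0.6), (0.1, 0.5)),     # up and to the right
         "shield": ((-0.6, -0.2), (0.1, 0.5))}  # up and to the left
joints = {"shoulder_center": (0.0, 1.4, 2.0), "hand_right": (0.4, 1.7, 1.9)}
print(select_weapon(joints, zones))  # -> "sword"
```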
Proceedings of International Conference on Artificial Life and Robotics
Augmented reality (AR) is a technique in which 3D virtual objects are integrated into a real environment in real time. Augmented reality applications such as medical visualization, maintenance and repair, robot path planning, entertainment, military aircraft navigation, and targeting have been proposed. This paper introduces the development of an augmented reality game that allows the user to carry out arm exercises using a natural user interface based on Microsoft Kinect. The system has been designed as an augmented game in which the user's hands operate in a world augmented with virtual objects generated by computer graphics. The player sits in a chair and grasps the yellow stars that are displayed on the stage. This encourages activity in a large number of arm muscles, helping to prevent their deterioration, and is also suitable for rehabilitation.
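A grasp interaction of this kind can be reduced to a proximity test between tracked hands and virtual objects; the sketch below shows that test under an assumed grasp radius and units, not the paper's code.

```python
# Illustrative sketch of the grasp interaction described above: a star
# counts as grasped when a tracked hand comes within a small radius of it
# in camera space (metres, assumed).
import math

def grasped_stars(hand_positions, stars, radius=0.10):
    """hand_positions / stars: lists of (x, y, z); returns indices of
    stars touched by any hand."""
    hit = []
    for i, s in enumerate(stars):
        for h in hand_positions:
            if math.dist(h, s) <= radius:
                hit.append(i)
                break
    return hit

stars = [(0.3, 1.5, 1.8), (-0.4, 1.2, 1.8)]
hands = [(0.32, 1.52, 1.79), (-0.1, 1.0, 1.9)]
print(grasped_stars(hands, stars))  # -> [0]
```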
International Journal of Exercise Science, 2015
The increasing popularity of active video games as a mode of physical activity prompted this investigation into the physiological differences between playing the Nintendo Wii™ and the Xbox Kinect™. Differences in motion capture technology between these systems suggest that using one may result in different movement patterns, and therefore different physiological responses, than the other. The purpose of this study was to compare the average (10-minute) and peak heart rate (HR, bpm), oxygen consumption (VO2, mL·kg⁻¹·min⁻¹), and energy expenditure (EE, kcal·kg⁻¹·hr⁻¹) while playing Boxing and Just Dance 2 (JD2) on the Wii™ and Kinect™. Fifteen college students (7 female, 8 male) completed 10-minute game sessions of Wii™ and Kinect™ Boxing and of Wii™ and Kinect™ JD2, in random order. Comparisons were made for average and peak HR, VO2, and EE. Average and peak HR, VO2, and EE were greater (p<0.05) while playing Boxing on the Kinect™ than on the Wii™. Average and peak VO2 and EE were greater (p<0.05) while playing JD2 on the Kinect™ than on the Wii™. Peak VO2 surpassed the moderate exercise intensity threshold only while playing Kinect™ Boxing and Kinect™ JD2. Higher physiological responses were elicited when playing Boxing and JD2 on the Kinect™ versus the Wii™. These findings demonstrate that, when active video games are used as a form of physical activity, the Kinect™ is a better choice than the Wii™.
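For readers unfamiliar with this kind of within-subjects comparison, the sketch below computes per-participant averages and runs a paired t-test on synthetic heart-rate data; the numbers are made up and the analysis only indicates the approach, not the study's actual statistics.

```python
# A minimal sketch of the kind of comparison reported above (not the
# study's analysis code): per-participant averages for the same game on
# two consoles, compared with a paired t-test on synthetic data.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n = 15                                           # participants
hr_wii = rng.normal(110, 10, size=n)             # average HR, Wii Boxing (bpm)
hr_kinect = hr_wii + rng.normal(12, 5, size=n)   # Kinect Boxing (bpm)

t, p = ttest_rel(hr_kinect, hr_wii)
print(f"mean Wii {hr_wii.mean():.1f} bpm, mean Kinect {hr_kinect.mean():.1f} bpm")
print(f"paired t = {t:.2f}, p = {p:.4f}")
```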
Lecture Notes in Computer Science, 2009
This paper is an evaluation of full-body interactive games using Kroflič's and Laban's framework of Body, Space, Time, and Relationship. An experiment with 8 participants playing 10 games for 20 minutes was conducted and recorded on digital video. Body, Space, and Time elements were measured using observation, motion tracking, and Quantity of Motion (QoM). The results informed the designer about the participants' physical experience through analysis of the postures used in each game, the quality of the movement, the body parts used in the interaction, the playing area, the direction of movement, the direction of gaze, tempo, dynamics, and QoM. The experiment highlighted important issues of the user's physical experience and showed that the method can provide useful information in the development and evaluation of full-body interactive games. The theoretical work of Laban and Kroflič also proved useful for interaction and game design in the transition from desktop to full-body interactive games.
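Quantity of Motion is commonly derived from frame differencing; the sketch below follows that common definition (the fraction of pixels that changed between consecutive frames), which is an assumption here rather than the paper's exact measure.

```python
# Common frame-differencing form of Quantity of Motion (assumed, not
# necessarily the paper's exact definition): the fraction of pixels whose
# intensity changed noticeably between two consecutive greyscale frames.
import numpy as np

def quantity_of_motion(prev_frame, frame, diff_threshold=15):
    """Frames: greyscale images as uint8 arrays of the same shape."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = (diff > diff_threshold).sum()
    return moving / frame.size        # fraction of the image that moved

prev = np.zeros((240, 320), dtype=np.uint8)
curr = prev.copy()
curr[100:140, 150:200] = 200          # a region that moved
print(round(quantity_of_motion(prev, curr), 4))   # -> 0.026
```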
Advances in Human-Computer Interaction, 2016
As gestural interfaces emerged as a new type of user interface, their use has been widely explored by the entertainment industry to better immerse the player in games. Although mainly used in dance and sports games, gestural interaction has seen little use in slower-paced genres such as board games. In this work, we present a Kinect-based gestural interface for an online, multiplayer chess game and describe a case study with users of different playing skill levels. We compared mouse/keyboard interaction with gesture-based interaction and synthesized the results into lessons learned regarding general usability and the design of game control mechanisms; these lessons could be applied to other slow-paced board games like chess. Our findings indicate that gestural interfaces may not be suitable for competitive chess matches, yet they can be fun to use in casual matches.
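One way to realize such a gesture-driven control mechanism is to map the normalized hand position onto board squares and commit a selection only after a dwell time; the sketch below is a hypothetical illustration of that idea, not the system described in the paper.

```python
# Hypothetical sketch of a gesture-based chess control: map a normalised
# hand position over the board area to an 8x8 square, and commit the
# selection only after the hand dwells on the same square for a while.
import time

FILES = "abcdefgh"

def hand_to_square(nx, ny):
    """nx, ny in [0, 1): normalised hand position over the board area."""
    col = min(int(nx * 8), 7)
    row = min(int(ny * 8), 7)
    return f"{FILES[col]}{8 - row}"   # row 0 at the top -> rank 8

class DwellSelector:
    """Commit a square only after the hand stays on it for `dwell_s` seconds."""
    def __init__(self, dwell_s=1.0):
        self.dwell_s, self.square, self.since = dwell_s, None, None
    def update(self, square, now=None):
        now = time.monotonic() if now is None else now
        if square != self.square:
            self.square, self.since = square, now
            return None
        if now - self.since >= self.dwell_s:
            return square
        return None

print(hand_to_square(0.55, 0.93))     # -> "e1"
sel = DwellSelector(dwell_s=1.0)
print(sel.update("e1", now=0.0), sel.update("e1", now=1.2))  # -> None e1
```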
International Journal of Distributed Sensor Networks, 2015
This paper investigates the capability of the Kinect sensor as an interactive technology and discusses how it can assist and improve teaching and learning. The Kinect sensor is a motion sensor that provides a natural user interface; it was developed by Microsoft for the Xbox 360 video-game console to create a new controller-free experience for the user, without any intermediary device. The Kinect sensor can enhance kinesthetic pedagogical practices to benefit learners with strong bodily-kinesthetic intelligence (body smart). As a learning tool, the Kinect sensor has the potential to create interactive games, increase learner motivation, and enhance learning efficiency through its multimedia and multisensory capacity. Many students must learn spatial skills to improve learning achievement in science, mathematics, and engineering. This paper focuses on developing a Kinect sensor-assisted game-based learning system with the ARCS model to provide kinesthetic pedagogical practices for learning spatial skills.
Journal of Motor Learning and Development, 2014
This study sought to explore the types of fundamental movement skills (FMS) performed during active video game (AVG) play, as well as the frequency with which these FMS are performed. In addition, this study aimed to determine the relationship between FMS performance and energy expenditure during 15 min of AVG play across two Microsoft Xbox Kinect AVGs. Fundamental movement skills were observed on video by two raters, and energy expenditure was derived using Actiheart monitors, in children aged 10-15 years. Six different FMS were observed during AVG play, with differences in the number of FMS performed between the two AVGs. The overall energy expended (Joules/kg/minute), however, was similar between the AVGs, suggesting the frequency of FMS did not influence overall energy expenditure during play. The movements observed during AVG play that possibly accounted for the energy expenditure were not of a quality that could be classified as FMS. This research demonstrates that children playing these two games have the opportunity to repeatedly perform mainly two FMS, namely jumping and dodging. The goal of the AVGs, however, could be achieved with generalized movements that did not always meet the criteria to be classified as FMS.
2019
As technological innovation is fused into the rehabilitation process, it gives conventional therapy a new direction with interactive products and easy-to-measure techniques. In recent years, virtual reality-based game therapy has turned out to be a promising option for post-stroke patients, since it engages patients in fun-based exercises during the rehabilitation process; it also triggers their neuro-motor functions and accelerates recovery. Nevertheless, it is necessary to extract valuable information from the joint movements to measure a patient's recovery. Most of the designed games have introduced features to make them interesting as well as challenging for patients; however, only a few measure the joint parameters. We have designed a Kinect-based game on the Unity3D platform in which patients play by moving their joints, which corresponds to the different orthopaedic lessons required for rehabilitation therapy. In contrast to many Kinect-based games in which only joint movements are used to play, we have also introduced voice control through speech recognition, with feedback provided as audiovisual commands to enhance patient engagement. Different joint parameters, such as trajectory, range of motion, joint velocity, acceleration, reaching time, and joint torque, are also measured to help quantify the patient's health condition.
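Several of the listed joint parameters can be derived directly from tracked joint positions; the sketch below computes an elbow angle from shoulder, elbow, and wrist positions and then derives range of motion, angular velocity, and acceleration by numerical differentiation. The method and the synthetic motion are assumptions for illustration, not the authors' implementation.

```python
# Assumed sketch of deriving some listed joint parameters from tracked
# joint positions: elbow angle from shoulder-elbow-wrist, then range of
# motion, angular velocity and acceleration by numerical differentiation.
import numpy as np

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c, e.g. shoulder-elbow-wrist."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def elbow_metrics(shoulder, elbow, wrist, fps=30.0):
    """Each argument: (T, 3) array of positions over T frames."""
    angles = np.array([joint_angle(s, e, w)
                       for s, e, w in zip(shoulder, elbow, wrist)])
    rom = angles.max() - angles.min()          # range of motion (deg)
    vel = np.gradient(angles, 1.0 / fps)       # deg/s
    acc = np.gradient(vel, 1.0 / fps)          # deg/s^2
    return rom, vel, acc

# Synthetic 2-second flexion/extension of the elbow for illustration.
t = np.arange(60) / 30.0
shoulder = np.tile([0.0, 1.4, 2.0], (60, 1))
elbow = np.tile([0.0, 1.1, 2.0], (60, 1))
wrist = np.stack([0.3 * np.sin(np.pi * t), 1.1 - 0.3 * np.cos(np.pi * t),
                  np.full_like(t, 2.0)], axis=1)
rom, vel, acc = elbow_metrics(shoulder, elbow, wrist)
print(f"range of motion ~ {rom:.0f} deg, peak velocity ~ {np.abs(vel).max():.0f} deg/s")
```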
Proceedings of the 2019 ACM International Conference on Interactive Surfaces and Spaces
Gesture recognition devices provide a new means of natural human-computer interaction. However, when selecting these devices for games, designers might find it challenging to decide which gesture recognition device will work best. In the present research, we compare three vision-based hand-gesture devices: Leap Motion, Microsoft's Kinect, and Intel's RealSense. We developed a simple hand-gesture-based game to evaluate the performance, cognitive demand, comfort, and player experience of using these devices. We found that participants preferred and performed much better with Leap Motion and Kinect than with RealSense; Leap Motion also outperformed or was equivalent to Kinect. These findings suggest that not all gesture recognition devices are suitable for games and that designers need to make careful decisions when selecting gesture recognition devices and designing gesture-based games to ensure the usability, accuracy, and comfort of such games.
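One practical consequence of such comparisons is to keep the game logic independent of the specific device; the sketch below hides the device behind a small interface with a replayable recorded back end. The interface and class names are hypothetical, not part of any SDK.

```python
# Hypothetical abstraction layer (not from the paper or any SDK): the game
# polls a GestureDevice, so Leap Motion, Kinect or RealSense bindings could
# sit behind the same interface; a recorded trace is used here for testing.
from abc import ABC, abstractmethod
from typing import Optional, Tuple

class GestureDevice(ABC):
    @abstractmethod
    def poll_hand(self) -> Optional[Tuple[float, float, float]]:
        """Return the tracked hand position in metres, or None if lost."""

class RecordedDevice(GestureDevice):
    """Replays a recorded trace; handy for comparing devices offline."""
    def __init__(self, trace):
        self._frames = iter(trace)
    def poll_hand(self):
        return next(self._frames, None)

def run_game(device: GestureDevice, steps=3):
    for _ in range(steps):
        print("hand:", device.poll_hand())

run_game(RecordedDevice([(0.1, 1.2, 0.6), (0.12, 1.25, 0.58)]))
```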
