Self-Adjusting Rear-View Mirror
2014
Abstract
The project aimed to develop a self-adjusting rear-view mirror system utilizing a Microsoft Kinect for tracking head movements and an Arduino for control. The system was intended to automate mirror adjustments based on the operator's head position, but the work faced challenges in programming compatibility between the Kinect and the Arduino. Despite not achieving full functionality, the experience provided valuable insights into Kinect programming and potential future implementations.
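For illustration, a minimal sketch of the Arduino side of such a system, assuming a host PC runs the Kinect skeleton tracking and streams pan/tilt angles over USB serial. The serial protocol, pin assignments, and servo layout here are hypothetical, not the project's actual design:

```cpp
// Arduino-side sketch (hypothetical protocol): a host PC tracks the driver's
// head with the Kinect SDK and streams "pan,tilt\n" angle pairs (integer
// degrees) over USB serial; two hobby servos re-aim the mirror.
#include <Servo.h>

Servo panServo;   // horizontal mirror axis
Servo tiltServo;  // vertical mirror axis

void setup() {
  Serial.begin(9600);
  panServo.attach(9);    // assumed wiring: pan servo on pin 9
  tiltServo.attach(10);  // assumed wiring: tilt servo on pin 10
}

void loop() {
  if (Serial.available() > 0) {
    // parseInt() skips the comma delimiter and returns the next integer
    int pan = Serial.parseInt();
    int tilt = Serial.parseInt();
    if (Serial.read() == '\n') {           // frame terminator seen: apply
      panServo.write(constrain(pan, 0, 180));
      tiltServo.write(constrain(tilt, 0, 180));
    }
  }
}
```

Keeping the Kinect processing on the PC and sending only two angles keeps the Arduino's job trivial, which is one plausible way around the compatibility problems the abstract mentions.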
Related papers
Purpose: With the advent of modern technologies, information management is evolving, and various smart tools and techniques are used for run-time decision making. In this paper we discuss the integration of the Kinect sensor into two different information management systems.
2014
Abstract: The purpose of this work is to develop a man-machine interface for the UC-win/Road VR software to improve the interaction between the real and virtual worlds. The Kinect plugin interface presented in this paper is based on a 3D depth sensor using infrared (IR) technology. The development started with the Kinect sensor, a device initially released by Microsoft as an input controller for its Xbox video game console, then moved to the Xtion Pro, a smaller, similar sensor. This paper presents the Kinect Plugin as well as two applications oriented towards robotics, to demonstrate the Kinect Plugin's capabilities.
This paper deals with intuitive ways of building an attractive Human Interaction Mirror (HIM) using the Microsoft Kinect sensor. Our work is mainly based on extracting the human body from the video stream and enabling user interaction. The fusion of the user's body motion and a 3D cloth model is displayed virtually in our HIM mirror. The virtual image is generated by hybridizing a skeletal tracking algorithm with a PCA-based face recognition algorithm. The 3D cloth is matched to the superimposed image by skin-color detection, and the clothes are adapted to the body of the user in front of the interactive mirror. The Kinect SDK is used for various fundamental functions and for the tracking process, and the entire application is developed in the .NET framework.
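As a sketch of the PCA component, the core of an eigenfaces-style recognizer fits in a few lines. The example below uses the Eigen library rather than the authors' Kinect SDK/.NET stack, and the function names (trainEigenfaces, project) are illustrative:

```cpp
// Eigenfaces-style PCA: learn a low-dimensional face basis, then project new
// faces into it; recognition reduces to nearest-neighbour search over codes.
#include <Eigen/Dense>
#include <iostream>

using Eigen::MatrixXd;
using Eigen::VectorXd;

// X: d x n matrix, one flattened face image per column.
// Returns the top-k eigenfaces as columns of a d x k basis.
MatrixXd trainEigenfaces(const MatrixXd& X, int k, VectorXd& meanFace) {
  meanFace = X.rowwise().mean();               // average face
  MatrixXd centered = X.colwise() - meanFace;  // subtract mean from each face
  Eigen::JacobiSVD<MatrixXd> svd(centered, Eigen::ComputeThinU);
  return svd.matrixU().leftCols(k);            // principal directions
}

// Project a (mean-subtracted) face into eigenface space.
VectorXd project(const MatrixXd& basis, const VectorXd& meanFace,
                 const VectorXd& face) {
  return basis.transpose() * (face - meanFace);
}

int main() {
  MatrixXd faces = MatrixXd::Random(1024, 20);  // 20 toy 32x32 "faces"
  VectorXd mean;
  MatrixXd basis = trainEigenfaces(faces, 8, mean);
  VectorXd code = project(basis, mean, faces.col(0));
  std::cout << "8-D face code:\n" << code << "\n";
}
```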
Proc. of the Conference on Robotics in Education 2011, 2011
This paper deals with education in the field of robot sensing abilities. It briefly introduces the commonly known and used concepts and sensors, but focuses mainly on the recent Kinect sensor. Technical information and background on the Kinect are provided. The last part of the article deals with possible applications of the sensor in various robotic fields with emphasis on the educational process.
Software: Practice and Experience, 2018
The Universal Kinect-type-controller by ICE Lab (UKI, pronounced “You-key”) was developed to allow users to control any existing application using body motions as inputs. The middleware works by converting detected motions into keyboard and/or mouse-click events and sending them to a target application. This paper presents the structure and design of the core modules, along with examples from real cases to illustrate how the middleware can be configured to fit a variety of applications. We present our designs of interfaces that decode all configuration details into a human-interpretable language; these interfaces significantly improve the user experience and eliminate the user divide in, for example, programming skill. The performance of the middleware is evaluated on fighting-game motion data, and we make these data publicly available so that they can be used in other research. UKI is open to everyone for any purpose; for instance, it can be used to promote a healthy life through gaming or to conduct serious research on motion systems. The middleware serves as a shortcut in the development of motion applications: coding an application to detect motions can be replaced with simple clicks in UKI, as the sketch below illustrates.
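The core conversion idea can be sketched as a configurable lookup from motion labels to key events. The binding table and the sendKey() stub below are hypothetical, standing in for UKI's actual configuration files and OS-level input injection (e.g. the Windows SendInput API):

```cpp
// Motion-to-key middleware in miniature: a configurable table maps detected
// motion labels to key events, decoupling motion detection from the target
// application.
#include <iostream>
#include <map>
#include <string>

// Stub standing in for OS-level keyboard injection.
void sendKey(char key) {
  std::cout << "key event: " << key << "\n";
}

int main() {
  // Hypothetical configuration: motion label -> keyboard key.
  std::map<std::string, char> bindings = {
      {"punch", 'a'}, {"kick", 'b'}, {"jump", ' '}};

  // Motions as a detector might report them, frame by frame.
  for (const std::string& motion : {"punch", "jump", "kick"}) {
    auto it = bindings.find(motion);
    if (it != bindings.end()) sendKey(it->second);
  }
}
```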
Research Journal of Applied Sciences, Engineering and Technology, 2014
In the presented work, a remote robot control system is implemented that utilizes Kinect-based gesture recognition as the human-robot interface. The movement of the human arm in 3D space is captured, processed, and replicated by the robotic arm. The joint angles are transmitted to an Arduino microcontroller, which receives them and controls the robot arm. The accuracy of control by the human's hand motion was tested experimentally.
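One step such a system needs is recovering joint angles from the tracked 3D joint positions before sending them to the Arduino. A minimal sketch, assuming shoulder/elbow/wrist positions from the Kinect skeleton stream (the paper's own angle computation is not given):

```cpp
// The elbow angle is the angle between the upper-arm and forearm vectors,
// recovered with a dot product from three tracked joint positions.
#include <algorithm>
#include <cmath>
#include <iostream>

struct Vec3 { double x, y, z; };

Vec3 sub(const Vec3& a, const Vec3& b) {
  return {a.x - b.x, a.y - b.y, a.z - b.z};
}
double dot(const Vec3& a, const Vec3& b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Angle at the middle joint (the elbow), in degrees.
double jointAngle(const Vec3& shoulder, const Vec3& elbow, const Vec3& wrist) {
  const Vec3 upper = sub(shoulder, elbow);  // elbow -> shoulder
  const Vec3 fore = sub(wrist, elbow);      // elbow -> wrist
  double c = dot(upper, fore) /
             (std::sqrt(dot(upper, upper)) * std::sqrt(dot(fore, fore)));
  c = std::max(-1.0, std::min(1.0, c));     // guard acos domain
  return std::acos(c) * 180.0 / 3.14159265358979323846;
}

int main() {
  // Toy joint positions in metres (Kinect camera space).
  Vec3 shoulder{0.0, 0.5, 2.0}, elbow{0.3, 0.5, 2.0}, wrist{0.3, 0.8, 2.0};
  std::cout << "elbow angle: " << jointAngle(shoulder, elbow, wrist)
            << " deg\n";  // 90 for this right-angle pose
}
```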
Robotics and Computer-Integrated Manufacturing, 2017
In this paper an application of the Kinect V2 sensor in the robotic field is described. The sensor is used as a vision device for detecting the position, shape, and dimensions of an object in the working space of a robot in order to plan the end-effector path. The algorithms used for the recognition of the contour and spatial position of planar shapes are described, and the technique adopted for the recognition of 3D objects is presented. The first results provided by a prototype gluing robot for the bonding of leather patches and shoe soles are presented. The results confirm the possibility of using the Kinect V2 sensor as an alternative to well-consolidated 3D measuring devices, which are definitely more accurate but also much more expensive.
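A minimal sketch of the planar-shape step, assuming OpenCV and a depth image thresholded at the work-surface distance; synthetic data stands in for the Kinect V2 stream, and this is not the authors' actual pipeline:

```cpp
// Threshold a depth image to isolate the object above the work surface,
// then extract its outer contour for path planning.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
  // Synthetic 8-bit "depth" image: background at 200, an object patch at 120.
  cv::Mat depth(480, 640, CV_8UC1, cv::Scalar(200));
  cv::rectangle(depth, cv::Rect(200, 150, 180, 120), cv::Scalar(120),
                cv::FILLED);

  // Keep pixels closer than the table surface (depth value < 150 here).
  cv::Mat mask;
  cv::threshold(depth, mask, 150, 255, cv::THRESH_BINARY_INV);

  // Outer contours give the shape outline used to plan the end-effector path.
  std::vector<std::vector<cv::Point>> contours;
  cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

  for (const auto& c : contours)
    std::cout << "contour with " << c.size() << " points, area "
              << cv::contourArea(c) << "\n";
}
```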
During the last three years after the launch of the Microsoft Kinect® in the end-consumer market, we have witnessed a small revolution in computer vision research towards the use of a standardized consumer-grade RGB-D sensor for scene content retrieval. Besides classical localization and motion-capturing tasks, the Kinect has successfully been employed for the reconstruction of opaque and transparent objects. This report gives a comprehensive overview of the main publications using the Microsoft Kinect outside of its original context as a decision-forest-based motion-capturing tool.
Agricultural Information Research, 2013
Japanese farmers are aging, and agricultural robots of the cooperative working type, which perform tasks together with humans, are required in particular. Additionally, in the case of the cooperative working type, compatibility with humans is important, and functions that even new agricultural workers and elderly people can control intuitively are desired. The function is Kansei communication, and we have proposed agricultural robots equipped with it, such as the Kansei Agri-robot and the Chinou robot, which extracts tacit knowledge, and we have been studying and developing them. In this paper, we built and evaluated an intuitive motion-based control component, one of the core techniques. We built the system using the Kinect sensor, which can trace the skeleton information of a human. The Kinect sensor is a gaming device for the Xbox 360 released by Microsoft Corporation in 2010; it consists of an infrared light source for its distance sensor, a video sensor, a distance sensor, and a multi-array microphone. The target motion was the "finger pointing" motion, used to give the robot a working area or a location to move to. As for the development environment, we used Windows 7 as the OS, OpenNI as the library, and NITE as the middleware, with Visual Studio 2010 and the C++ language for software development. The results are as follows: First, the skeleton information of a farmer could be extracted from various angles using the Kinect sensor. Next, an algorithm could be built to calculate "finger pointing" points from the joint coordinates of the shoulders, hands, and feet. According to the verification experiment, the accuracy was high relative to the assumed robot size and working area, and control of a robot by hand pointing became possible. The estimation errors varied depending on the sensing angle of the robot toward the farmer; errors when sensing from behind the farmer were greater than those from other angles. It was also found that the Kinect sensor can be used even in the field in the early morning and late afternoon, when light intensity decreases, and under artificial lighting.
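The "finger pointing" estimate can be modelled as a ray from one joint through another, intersected with the ground plane. The sketch below works under that assumption (shoulder-through-hand ray, flat ground at y = 0) and is not the authors' OpenNI/NITE implementation:

```cpp
// Estimate the ground position a person points at: cast a ray from the
// shoulder through the hand and intersect it with the plane y = 0.
#include <iostream>

struct Vec3 { double x, y, z; };

// Returns false when the arm points at or above the horizon.
bool pointingTarget(const Vec3& shoulder, const Vec3& hand, Vec3& target) {
  Vec3 dir{hand.x - shoulder.x, hand.y - shoulder.y, hand.z - shoulder.z};
  if (dir.y >= 0.0) return false;   // not pointing downward
  double t = -shoulder.y / dir.y;   // ray parameter where y reaches 0
  target = {shoulder.x + t * dir.x, 0.0, shoulder.z + t * dir.z};
  return true;
}

int main() {
  // Toy joint positions in metres.
  Vec3 shoulder{0.0, 1.4, 0.0}, hand{0.4, 1.2, 0.5}, target{};
  if (pointingTarget(shoulder, hand, target))
    std::cout << "pointed-at ground position: (" << target.x << ", "
              << target.z << ")\n";
}
```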
