Low Cost Open Source Modal Virtual Environment Interfaces Using Full Body Motion Tracking and Hand Gesture Recognition

Author: Matthew J. Marangoni
Publisher:
Total Pages: 59
Release: 2013
Genre: Shared virtual environments
ISBN:


Virtual environments provide insightful and meaningful ways to explore data sets through immersive experiences. One of the ways immersion is achieved is through natural interaction methods instead of only a keyboard and mouse. Intuitive tracking systems for natural interfaces suitable for such environments are often expensive. Recently, however, devices such as gesture tracking gloves and skeletal tracking systems have emerged in the consumer market. This project integrates gestural interfaces into an open source virtual reality toolkit using consumer-grade input devices and generates a set of tools to enable multimodal gestural interface creation. The AnthroTronix AcceleGlove is used to augment body tracking data from a Microsoft Kinect with fine-grained hand gesture data. The tools are found to be useful, as a sample gestural interface is implemented with them. The project concludes by suggesting studies targeting gestural interfaces using such devices, as well as other areas for further research.
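The abstract describes fusing coarse Kinect body-tracking data with fine-grained glove gesture data into one multimodal interface. A minimal sketch of that fusion idea, with all type and gesture names being illustrative assumptions rather than the thesis's actual API:

```python
# Hypothetical sketch: combining a coarse body pose (from skeletal
# tracking) with a fine-grained hand gesture (from a glove) into a
# single multimodal interface command.
from dataclasses import dataclass

@dataclass
class BodyPose:
    right_hand_raised: bool  # derived from skeleton joint positions

@dataclass
class HandGesture:
    name: str  # e.g. "point" or "fist", as reported by the glove

def fuse(pose: BodyPose, gesture: HandGesture) -> str:
    """Map a (body pose, hand gesture) pair to an interface command."""
    if pose.right_hand_raised and gesture.name == "point":
        return "select"
    if pose.right_hand_raised and gesture.name == "fist":
        return "grab"
    return "idle"

print(fuse(BodyPose(True), HandGesture("point")))  # select
```

The point of the combination is that neither device alone distinguishes these commands: the skeleton supplies the arm context, the glove supplies the finger posture.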

Virtual Environments and Advanced Interface Design

Author: Woodrow Barfield
Publisher: Oxford University Press
Total Pages: 595
Release: 1995-06-01
Genre: Computers
ISBN: 0195360532


This sweeping introduction to the science of virtual environment technology masterfully integrates research and practical applications culled from a range of disciplines, including psychology, engineering, and computer science. With contributions from the field's foremost researchers and theorists, the book focuses in particular on how virtual technology and interface design can better accommodate human cognitive, motor, and perceptual capabilities. Throughout, it brings the reader up-to-date with the latest design strategies and cutting-edge virtual environments, and points to promising avenues for future development. The book is divided into three parts. The first part introduces the reader to the subject by defining basic terms, identifying key components of the virtual environment, and reviewing the origins and elements of virtual environments. The second part focuses on current technologies used to present visual, auditory, tactile, and kinesthetic information. The book concludes with an in-depth analysis of how environments and human perception are integrated to create effective virtual systems. Comprehensive and splendidly written, Virtual Environments and Advanced Interface Design will be the "bible" on the subject for years to come. Students and researchers in computer science, psychology, and cognitive science will all want to have a copy on their shelves.

Economical and User-Friendly Design of Vision-Based Natural-User Interface Via Dynamic Hand Gestures

Author: Richa Golash
Publisher:
Total Pages: 11
Release: 2020
Genre:
ISBN:


A vision-based hand gesture recognition technology can bring a revolutionary and beneficial change in the field of human-machine interaction for elderly people and people with special needs living at home. Nevertheless, continuous detection and localization of the moving hand region in true-color images are strenuous tasks because the hand is a non-rigid object and occupies a small area of the whole frame. True-color images are also sensitive to light variation, camera view, and background conditions. To ease hand detection and tracking, researchers prefer advanced cameras equipped with costly sensors; this increases the overall cost of the interface and requires technical knowledge to operate it. A second issue in dynamic hand gesture recognition is the unpredictability of the hand pose used and the random behavior of the hand movement while the gesture is performed. The use of dynamic hand gestures in vision-based human-machine interaction is therefore limited. The goal of this paper is to propose an economical and user-friendly technique for vision-based human-machine interaction via dynamic hand gestures. The proposed technique can be integrated with everyday machines, for example washing machines, radios, fans, and automated doors.
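A dynamic gesture ultimately reduces to the trajectory of the tracked hand region over successive frames. A hedged sketch of the downstream classification step (the segmentation that produces the centroids, and the threshold value, are assumptions, not the paper's method):

```python
# Illustrative sketch: classifying a dynamic hand gesture from the
# trajectory of the hand-region centroid across video frames. A real
# vision pipeline would first segment the hand in each color frame;
# here the per-frame centroids are assumed to be given.

def classify_swipe(centroids, min_travel=50):
    """Label a centroid trajectory as a left/right swipe or no gesture.

    centroids: list of (x, y) hand-region centers, one per frame.
    min_travel: minimum horizontal displacement in pixels needed to
                count as a deliberate swipe rather than hand jitter.
    """
    if len(centroids) < 2:
        return "none"
    dx = centroids[-1][0] - centroids[0][0]
    if dx >= min_travel:
        return "swipe_right"
    if dx <= -min_travel:
        return "swipe_left"
    return "none"

print(classify_swipe([(10, 100), (40, 102), (90, 98)]))  # swipe_right
```

The jitter threshold is what makes such an interface tolerant of the random hand behavior the abstract mentions: small unintended movements fall below `min_travel` and produce no command.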

Real-time Immersive Human-computer Interaction Based on Tracking and Recognition of Dynamic Hand Gestures

Author: Gan Lu
Publisher:
Total Pages:
Release: 2011
Genre:
ISBN:


With the fast development and ever-growing use of computer-based technologies, human-computer interaction (HCI) plays an increasingly pivotal role. In virtual reality (VR), HCI technologies provide not only a better understanding of three-dimensional shapes and spaces, but also sensory immersion and physical interaction. With hand-based HCI being a key modality for object manipulation and gesture-based communication, providing users with a natural, intuitive, effortless, precise, and real-time method for HCI based on dynamic hand gestures is challenging, due to the complexity of hand postures formed by multiple joints with high degrees of freedom, the speed of hand movements with highly variable trajectories and rapid direction changes, and the precision required for interaction between hands and objects in the virtual world. Presented in this thesis is the design and development of a novel real-time HCI system based on a unique combination of a pair of data gloves based on fibre-optic curvature sensors to acquire finger joint angles, a hybrid tracking system based on inertia and ultrasound to capture hand position and orientation, and a stereoscopic display system to provide immersive visual feedback. The potential and effectiveness of the proposed system are demonstrated through a number of applications, namely hand gesture based virtual object manipulation and visualisation, hand gesture based direct sign writing, and hand gesture based finger spelling. For virtual object manipulation and visualisation, the system is shown to allow a user to select, translate, rotate, scale, release, and visualise virtual objects (presented using graphics and volume data) in three-dimensional space using natural hand gestures in real-time.
For direct sign writing, the system is shown to immediately display the corresponding SignWriting symbols signed by a user using three different signing sequences and a range of complex hand gestures, which consist of various combinations of hand postures (with each finger open, half-bent, closed, adducted, or abducted), eight hand orientations in the horizontal/vertical planes, three palm-facing directions, and various hand movements (which can have eight directions in the horizontal/vertical planes, and can be repetitive, straight/curved, or clockwise/anti-clockwise). The development includes a special visual interface that gives not only a stereoscopic view of hand gestures and movements, but also structured visual feedback for each stage of the signing sequence. An excellent basis is therefore formed to develop a full HCI based on all human gestures by integrating the proposed system with facial expression and body posture recognition methods. Furthermore, for finger spelling, the system is shown to be able to recognise five vowels signed with two hands using British Sign Language in real-time.
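The posture categories above (open, half-bent, closed) are recovered from the glove's per-joint curvature readings. A minimal sketch of that mapping, assuming degree-valued bend readings and illustrative thresholds that are not taken from the thesis:

```python
# Hedged sketch: mapping a curvature-sensor bend reading per finger
# to the open / half-bent / closed posture categories. The 30 and 90
# degree thresholds are illustrative assumptions.

def finger_state(bend_deg: float) -> str:
    """Classify one finger from its summed joint bend angle in degrees."""
    if bend_deg < 30:
        return "open"
    if bend_deg < 90:
        return "half-bent"
    return "closed"

def hand_posture(bends):
    """Classify all five fingers, thumb to pinky."""
    return [finger_state(b) for b in bends]

print(hand_posture([10, 45, 120, 95, 5]))
# ['open', 'half-bent', 'closed', 'closed', 'open']
```

In practice such thresholds would be set per user during calibration, since resting bend readings vary between hands.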

Real-time Vision-based Hand Tracking and Gesture Recognition

Author: Qing Chen
Publisher:
Total Pages: 214
Release: 2008
Genre: Computer vision
ISBN:


An application of gesture-based interaction with a 3D gaming virtual environment is implemented. With this system, the user can navigate the 3D gaming world by driving the avatar car with a set of hand postures. When the user wants to manipulate virtual objects, he can use a set of hand gestures to select a target traffic sign and open a window to check the information of the corresponding learning object. This application demonstrates that the gesture-based interface achieves improved interaction that is more intuitive and flexible for the user.
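Driving an avatar car with hand postures amounts to a dispatch table from recognised posture labels to navigation commands. A sketch under assumed names (the posture and command vocabulary here is hypothetical, not the thesis's):

```python
# Illustrative sketch: dispatching recognised hand postures to
# avatar-car navigation commands in a 3D gaming environment.

COMMANDS = {
    "open_palm": "accelerate",
    "fist": "brake",
    "thumb_left": "steer_left",
    "thumb_right": "steer_right",
}

def dispatch(posture: str) -> str:
    # Any unrecognised posture leaves the car coasting, so recognition
    # glitches never trigger an unintended command.
    return COMMANDS.get(posture, "coast")

print(dispatch("fist"))  # brake
```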

Statistical Hand Gesture Recognition System Using the Leap Motion Controller

Author: Michael Dimartino
Publisher:
Total Pages: 44
Release: 2016
Genre:
ISBN:


As technology continues to improve, hand gesture recognition as a form of human-computer interaction is becoming more and more feasible. One such piece of technology, the Leap Motion Controller, provides 3D tracking data of the hands through an easy-to-use API. This thesis presents an application that uses Leap Motion tracking data to learn and recognize static and dynamic hand gestures. Gestures are recognized using statistical pattern recognition. Each gesture is defined by a set of features including fingertip positions, hand orientation, and movement. Given sufficient training data, the similarity between two gestures is measured by comparing each of their corresponding features. Two separate implementations are presented for dealing with the temporal aspect of dynamic gestures. Users are able to interact with the system through a convenient graphical user interface. The accuracy of the system was experimentally tested with the help of two separate test participants: one for the training phase and one for the recognition phase. All test gestures (both static and dynamic) were successfully recognized with minimal training data, although in some cases additional gestures were mistakenly recognized.
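Measuring similarity by comparing corresponding features, as the abstract describes, can be sketched as nearest-template matching over feature vectors. The feature values and gesture names below are assumptions for illustration, not the thesis's actual feature set:

```python
# Sketch of statistical gesture matching: each trained gesture is a
# feature vector (e.g. normalised fingertip positions), and an input
# sample is labelled with its nearest template.
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognise(sample, templates):
    """Return the label of the closest trained gesture template."""
    return min(templates, key=lambda label: distance(sample, templates[label]))

templates = {
    "pinch": [0.1, 0.2, 0.9],   # hypothetical trained feature vectors
    "spread": [0.9, 0.8, 0.1],
}
print(recognise([0.15, 0.25, 0.85], templates))  # pinch
```

A nearest-template scheme also exhibits the failure mode the abstract reports: every input gets *some* nearest label, so unintended motions can be mistakenly recognised unless a rejection distance threshold is added.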

Immersive Articulation of the Human Upper Body in a Virtual Environment

Author: Paul F. Skopowski
Publisher:
Total Pages: 224
Release: 1996
Genre: Kinematics
ISBN:


This thesis addresses the problem that virtual environments (VEs) do not possess a practical, intuitive, and comfortable interface that allows a user to control a virtual human's movements in real-time. Such a device would give the user the feeling of being immersed in the virtual world, greatly expanding the usability of today's virtual environments. The approach was to develop an interface for the upper body, since it is through this part of their anatomy that users interact most with their environment; lower body motion can be more easily scripted. Implementation includes construction of a kinematic model of the upper body. The model is then manipulated in real-time with inputs from electromagnetic motion tracking sensors placed on the user. The research resulted in an interface that is easy to use and allows its user limited interaction with a VE. The device takes approximately one sixth the time to don and calibrate as mechanical interfaces with similar capability, and it tracks thirteen degrees of freedom. Upper body position is tracked, allowing the user to move through the VE. Users can orient their upper body and control the movements of one arm. Uncorrected position data from two trackers was used to generate clavicle joint angles, but difficulty in controlling figure motion indicates that the sensors used lack sufficient registration for this purpose. Therefore, the interface software uses only orientation data for computing joint angles.
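Computing a joint angle from orientation-only tracker data, as the interface described above does after discarding the noisy position data, reduces to taking the relative rotation between a parent and child segment's sensors. A simplified one-axis sketch (real electromagnetic trackers report full 3D orientations; the single heading angle here is an illustrative reduction):

```python
# Hypothetical sketch: one joint angle as the relative rotation of a
# child body segment with respect to its parent, from orientation-only
# sensor readings, wrapped to the (-180, 180] degree range.

def joint_angle(parent_heading_deg: float, child_heading_deg: float) -> float:
    """Relative rotation of the child segment w.r.t. its parent."""
    diff = (child_heading_deg - parent_heading_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff

print(joint_angle(350.0, 10.0))  # 20.0
```

The wrap-around handling matters: a naive subtraction across the 0/360 boundary would report a 340-degree bend for a 20-degree joint movement.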

A Dynamic Gesture Interface

Author: Thierry Métais
Publisher:
Total Pages: 236
Release: 2005
Genre: University of Ottawa theses
ISBN:
