
International Journal of Electronics and Computer Science Engineering
Available online at www.ijecse.org
ISSN 2277-1956

Gesture Technology: A Review


Aarti Malik¹, Ruchika²
¹Department of Electronics and Communication Engineering, I.E.T, Bhaddal, INDIA
²Department of Electronics and Communication Engineering, I.E.T, Bhaddal, INDIA
¹malik.aarti185@gmail.com, ²ruchi_khrn@yahoo.co.in
Abstract- Body language comprises two main parts, postures and gestures: a posture is a static position or pose, while a gesture is a dynamic hand or body sign. Traditional input devices have limitations, especially for users who are uncomfortable with them; a gesture-based interaction system offers a way around this problem. The main goal of gesture recognition is to create a human-interaction system that can recognize specific human gestures and use them to convey information or to control devices. This paper reviews gesture technology: its background, the types of gestures, and trends in technology, applications and usability.

Keywords- Gesture-based interaction system, User interfaces, Hand gestures, Applications

I. INTRODUCTION

For centuries the keyboard and mouse have been the main input devices, but with the growing popularity of ubiquitous and ambient devices such as digital TVs and play stations, options like grasping virtual objects, hand, head or body gestures, and eye-fixation tracking are becoming the need of the time. Gestures have always been used in human interaction (waving goodbye is a gesture), but with the advent of human-computer interaction systems their utility has increased. They provide an easy means of interacting with the surrounding environment, especially for handicapped people who are unable to live their lives in the traditional way.

A gesture is scientifically categorized into one of two distinct classes: dynamic and static. A dynamic gesture changes over a period of time, whereas a static gesture is observed at a single instant. A waving hand meaning goodbye is an example of a dynamic gesture, and the stop sign is an example of a static gesture. To understand a full message, it is necessary to interpret all the static and dynamic gestures over a period of time; this complex process is called gesture recognition. Gesture recognition is the process of recognizing and interpreting a continuous stream of sequential gestures from a given set of input data, such as movements of the hands, arms, face or head. With such a system, elderly and disabled people can control a PC without any mechanical aid. Gesture-controlled systems draw on several technologies, including cameras, graphics and computer vision.

II. TYPES OF GESTURES

According to Cadoz [1], gestures have three complementary and interdependent functions:
1. The epistemic function, which corresponds to perception. This includes:
- the haptic sense, which combines tactile (touch) and kinesthetic sensations (awareness of the position of the body and limbs) and gives information about size, shape and orientation;
- the proprioceptive sense, which provides information on weight and movement through joint sensors.
2. The ergotic function, which corresponds to actions applied to objects.
3. The semiotic function, which concerns communication. Examples include sign language and gestures accompanying speech.

Gestures are often language- and culture-specific. They can broadly be of the following types [2]:
1. Hand and arm gestures: recognition of hand poses, sign languages, and entertainment applications (allowing children to play and interact in virtual environments).


2. Head and face gestures: examples include a) nodding or head shaking, b) direction of eye gaze, c) raising the eyebrows, d) opening and closing the mouth, e) winking, f) flaring the nostrils, and g) looks of surprise, happiness, disgust, fear, sadness, and many others.
3. Body gestures: involvement of full-body motion, as in a) tracking the movements of two people having a conversation, b) analyzing the movements of a dancer against the music being played and its rhythm, and c) recognizing human gaits for medical rehabilitation and athletic training.

III. GESTURE ACQUISITION TECHNIQUES

A. Hidden Markov Models


This method exploits the dynamic aspects of gestures [3]. Its main goal is the recognition of two classes of gestures: deictic and symbolic.
1. Gestures are extracted from a sequence of video images by tracking the skin-colour blobs corresponding to the hand in a body-face space centred on the user's face.
2. Image filtering is done using a fast lookup indexing table of skin-colour pixels in YUV colour space.
3. After filtering, skin-colour pixels are gathered into blobs. Blobs are statistical objects based on the location (x, y) and the colourimetry (Y, U, V) of the skin-colour pixels, used to determine homogeneous areas.
4. A skin-colour pixel belongs to the blob that has the same location and colourimetry components.
Deictic gestures are pointing movements towards the left (right) of the body-face space, and symbolic gestures are intended to execute commands (grasp, click, rotate) on the left (right) of the shoulder. A sketch of the filtering and blob-gathering stage appears below.
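To make the filtering and blob stages concrete, here is a minimal Python/OpenCV sketch. The paper gives no code, so the chrominance thresholds, the minimum blob area, and the use of connected-component labelling in place of the statistical blob model are all assumptions for illustration, not details from [3].

```python
# Minimal sketch of the skin-colour filtering and blob-gathering stage.
# The chrominance bounds and min_area are illustrative assumptions, not
# values from [3]; real systems calibrate them per user and lighting.
import cv2
import numpy as np

U_MIN, U_MAX = 77, 127    # assumed U (chrominance) bounds for skin
V_MIN, V_MAX = 133, 173   # assumed V (chrominance) bounds for skin

def skin_blobs(frame_bgr, min_area=500):
    """Filter skin-colour pixels in YUV space, then gather them into blobs."""
    yuv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YUV)
    u, v = yuv[:, :, 1], yuv[:, :, 2]
    mask = ((u >= U_MIN) & (u <= U_MAX) &
            (v >= V_MIN) & (v <= V_MAX)).astype(np.uint8) * 255
    # Connected components stand in for the statistical "blobs": pixels
    # that agree in location and colourimetry end up in one region.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    blobs = [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
             for i in range(1, n)
             if stats[i, cv2.CC_STAT_AREA] >= min_area]
    return mask, blobs
```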

B. YUV Color Space and CAMSHIFT Algorithm [4]


This method deals with the recognition of hand gestures and proceeds in the following five steps (a sketch follows the list).
1. A digital camera records a video stream of hand gestures.
2. Every frame is considered and skin-colour-based segmentation is performed in YUV colour space. The YUV colour system is employed to separate chrominance from intensity: Y indicates intensity, while U and V specify the chrominance components.
3. The hand is then separated using the CAMSHIFT algorithm [5]. Since the hand is the largest connected region, it can be segmented from the body.
4. The position of the hand centroid is calculated in each frame by first computing the zeroth and first image moments and deriving the centroid from them.
5. The successive centroid points are joined to form a trajectory. This trajectory shows the path of the hand movement and thus completes the hand-tracking procedure.
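A sketch of steps 3-5 follows, assuming OpenCV's built-in CamShift. The initial search window, the chrominance bounds (reused from the previous sketch), and the use of a plain in-range mask instead of a histogram back-projection are assumptions made to keep the example short.

```python
# Minimal sketch of hand tracking (steps 3-5): CamShift localizes the
# hand, image moments give the centroid, and successive centroids are
# joined into a trajectory. Window init and thresholds are assumptions.
import cv2

def track_hand(frames, init_window):
    """Return the centroid trajectory of the hand across the frames."""
    term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    window = init_window                 # (x, y, w, h) around the hand
    trajectory = []
    for frame in frames:                 # frames are BGR images
        yuv = cv2.cvtColor(frame, cv2.COLOR_BGR2YUV)
        # A histogram back-projection would normally be used here; a
        # plain chrominance mask keeps the sketch short.
        mask = cv2.inRange(yuv, (0, 77, 133), (255, 127, 173))
        _, window = cv2.CamShift(mask, window, term)
        x, y, w, h = window
        m = cv2.moments(mask[y:y + h, x:x + w])  # zeroth & first moments
        if m["m00"] > 0:                 # centroid = (m10/m00, m01/m00)
            trajectory.append((x + m["m10"] / m["m00"],
                               y + m["m01"] / m["m00"]))
    return trajectory                    # joined points = the hand path
```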

C. Using a Time-of-Flight Camera


This approach uses x- and y-projections of the image, plus optional depth features, for gesture classification. The system uses a 3-D time-of-flight (TOF) sensor [6][7], which has the big advantage of simplifying hand segmentation. The gestures used in the system show good separation potential along the two image axes. The algorithm can be divided into five steps (a segmentation sketch follows):
1. Segmentation of the hand and arm via distance values: the hand and arm are segmented by an iterative seed-fill algorithm.
2. Determination of the bounding box: the segmented region is projected onto the x- and y-axes to determine the bounding box of the object.
3. Extraction of the hand.
4. Projection of the hand region onto the x- and y-axes.
5. Classification of the gesture from these projections and the optional depth features.
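Steps 1 and 2 can be sketched as follows, assuming the TOF sensor delivers a dense depth map and that the hand is the object closest to the camera; the depth tolerance is an illustrative assumption, not a value from [6].

```python
# Minimal sketch of steps 1-2: an iterative (queue-based) seed fill over
# the depth values segments the hand/arm, and projections of the mask
# onto the x- and y-axes give the bounding box. `tol` is an assumption.
import numpy as np
from collections import deque

def segment_hand(depth, tol=0.05):
    """Seed-fill from the nearest pixel; derive the bounding box."""
    seed = np.unravel_index(np.argmin(depth), depth.shape)
    mask = np.zeros(depth.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:                          # iterative seed fill
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < depth.shape[0] and 0 <= nc < depth.shape[1]
                    and not mask[nr, nc]
                    and abs(depth[nr, nc] - depth[r, c]) < tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    proj_x = np.flatnonzero(mask.any(axis=0))   # projection onto x-axis
    proj_y = np.flatnonzero(mask.any(axis=1))   # projection onto y-axis
    bbox = (proj_x[0], proj_y[0], proj_x[-1], proj_y[-1])
    return mask, bbox
```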




IV. MILESTONES IN GESTURE TECHNOLOGY

The following studies mark milestones in gesture technology; for each, the users and gesture type, the technology, the applications, and the result or conclusion are summarized.

Visual Touchpad [8], 2004. Users/gesture: general, hand. Technology: quadrangle panel with a rigid backing, connected to PCs, with two cameras. Application: interaction with PCs using the touchpad. Result/conclusion: a vision-based input device that allows fluid two-handed gestural interactions.

Visualization method [9], 2006. Users/gesture: elderly, hand. Technology: visualization-method architecture using accelerometer data. Application: gesture visualization; animation of the hand movement performed during the gesture. Result/conclusion: the visualization provides information about the gesture performed.

Intelligent Smart Home Control Using Body Gestures [10], 2006. Users/gesture: general. Technology: smart home with 3 CCD cameras and markers attached to the human body. Application: control of smart-home elements such as lights and curtains. Result/conclusion: a recognition rate of 95.42% for continuously changing gestures.

Head gesture recognition [11], 2007. Users/gesture: elderly and disabled. Technology: wheelchair with laptop and webcam. Application: hands-free control of an intelligent wheelchair. Result/conclusion: a head-gesture control system.

Select-and-Point [12], 2008. Users/gesture: general. Technology: composed of three parts (a presence server, a controlling peer and a controlled peer) using a camera and software tools. Application: enables control of applications such as MS Office, web browsers and multimedia programs across multiple devices. Result/conclusion: implementation of an intelligent meeting room.

User-Defined Gestures for Surface Computing [13], 2009. Users/gesture: general, hand. Technology: a user-defined gesture set, implications for surface technology, and a taxonomy of surface gestures. Application: gesture classifications developed for human discursive gestures and multimodal gestures with speech. Result/conclusion: results include a gesture taxonomy, the user-defined gesture set, performance measures, subjective responses, and qualitative observations.

Designing gestures: design patterns for (multi-touch) screens, 2011. Technology: desktop operating systems, mobile operating systems, third-party software, small software products, and common hardware products. Application: an industrial-design perspective on pointing devices as an input channel. Result/conclusion: discusses the evolution of interface design from a hardware-driven to a software-driven approach.

V. CONCLUSION

In this paper we have presented an introduction to gestures, the types of gestures, and different methods of acquiring them. A primary goal of virtual environments is to provide natural, efficient and flexible interaction between human and computer, and gestures as an input modality can help meet these requirements. Human gestures, being natural, are efficient and powerful, and this paper has reviewed this powerful medium.



VI. REFERENCES

[1] C. Cadoz, "Le geste, canal de communication homme-machine: la communication instrumentale," Technique et Science Informatiques, vol. 13, no. 1, 1994, pp. 31-61.
[2] S. Mitra and T. Acharya, "Gesture recognition: a survey," IEEE Transactions on Systems, Man, and Cybernetics - Part C, vol. 37, no. 3, pp. 311-324, 2007.
[3] Chih-Ming Fu et al., "Hand gesture recognition using a real-time tracking method and hidden Markov models," Image and Vision Computing, vol. 21, no. 8, 1 August 2003, pp. 745-758.
[4] Harshith C., Karthik R. Shastry, Manoj Ravindran, M.V.V.N.S. Srikanth and Naveen Lakshmikhanth, "Survey on various gesture recognition techniques for interfacing machines based on ambient intelligence," IJCSES, vol. 1, no. 2, November 2010.
[5] P. Vadakkepat et al., "Multimodal approach to human-face detection and tracking," IEEE Transactions on Industrial Electronics, vol. 55, no. 3, March 2008, pp. 1385-1393.
[6] E. Kollorz, J. Hornegger and A. Barke, "Gesture recognition with a time-of-flight camera," Dynamic 3D Imaging, International Journal of Intelligent Systems Technologies and Applications, vol. 5, no. 3-4, 2008, pp. 334-343.
[7] M. Bohme, M. Haker, T. Martinetz and E. Barth, "A facial feature tracker for human-computer interaction based on 3D TOF cameras," Dynamic 3D Imaging (workshop in conjunction with DAGM 2007), 2007.
[8] S. Malik and J. Laszlo, "Visual Touchpad: a two-handed gestural input device," in Proceedings of the ACM International Conference on Multimodal Interfaces, 2004, p. 289.
[9] K. Sanna, K. Juha, M. Jani and M. Johan, "Visualization of hand gestures for pervasive computing environments," in Proceedings of the Working Conference on Advanced Visual Interfaces, ACM, Italy, 2006, pp. 480-483.
[10] D. Kim and D. Kim, "An intelligent smart home control using body gestures," in Proceedings of the International Conference on Hybrid Information Technology (ICHIT'06), IEEE, Korea, 2006.
[11] P. Jia and Huosheng H. Hu, "Head gesture recognition for hands-free control of an intelligent wheelchair," Industrial Robot: An International Journal, Emerald, 2007, pp. 60-68.
[12] Hyunglae Lee, Heeseok Jeong, JoongHo Lee, Ki-Won Yeom, HyunJin Shin and Ji-Hyung Park, "Select-and-Point: a novel interface for multi-device connection and control based on simple hand gestures," CHI 2008, April 5-10, 2008, Florence, Italy, ACM 978-1-60558-012-8/08/04.
[13] Jacob O. Wobbrock, Meredith Ringel Morris and Andrew D. Wilson, "User-defined gestures for surface computing," in Proceedings of CHI 2009, ACM, 2009.

