ISSN 2277-1956
Gesture Technology: A Review

Aarti Malik (1), Ruchika (2)
(1) Department of Electronics and Communication Engineering, I.E.T, Bhaddal, INDIA
(2) Department of Electronics and Communication Engineering, I.E.T, Bhaddal, INDIA
(1) malik.aarti185@gmail.com, (2) ruchi_khrn@yahoo.co.in
Abstract- Body language comprises two main parts: postures and gestures, where a posture is a static position or pose and a gesture is a dynamic hand or body sign. Traditional input devices have limitations, especially for users who are uncomfortable with them; a way around this problem is a gesture-based interaction system. The main goal of gesture recognition is to create a human-interaction system that can recognize specific human gestures and use them to convey information or to control devices. This paper presents an introduction to gestures, reviews their types, and identifies trends in technology, application and usability.

Keywords- Gesture based interaction system, User interfaces, Hand gestures, Applications
I. INTRODUCTION

For centuries the keyboard and mouse have been the main input devices, but with the popularity of ubiquitous and ambient devices such as digital TVs and play stations, options like grasping virtual objects, hand, head or body gestures, and eye-fixation tracking are becoming the need of the time. Gestures have always been used in human interaction (waving goodbye, for example), but with the advent of human-computer interaction systems their utility has increased. They provide an easy means of interacting with the surrounding environment, especially for handicapped people who are unable to live their lives in the traditional way.

A gesture is scientifically categorized into two distinctive categories: dynamic and static. A dynamic gesture changes over a period of time, whereas a static gesture is observed at a single instant. A waving hand meaning goodbye is an example of a dynamic gesture, and the stop sign is an example of a static gesture. To understand a full message, it is necessary to interpret all the static and dynamic gestures over a period of time; this complex process is called gesture recognition. Gesture recognition is the process of recognizing and interpreting a stream of continuous sequential gestures from a given set of input data, such as movements of the hands, arms, face, or sometimes the head. Without any mechanical help, elderly and disabled people can also use a gesture-based system to control a PC. Different technologies such as cameras, graphics and computer vision are used in gesture-controlled systems.

II. TYPES OF GESTURES

According to Cadoz [1], gestures have three complementary and interdependent functions:
1. The epistemic function, which corresponds to perception. This includes:
- The haptic sense, which combines tactile (touch) and kinesthetic sensations (awareness of the position of the body and limbs), and gives information about size, shape and orientation.
- The proprioceptive sense, which provides information on weight and movement through joint sensors.
2. The ergotic function, which corresponds to actions applied to objects.
3. The semiotic function, which concerns communication. Examples include sign language and gestures accompanying speech.

Gestures are often language- and culture-specific. They can broadly be of the following types [2]:
1. Hand and arm gestures: recognition of hand poses, sign languages, and entertainment applications (allowing children to play and interact in virtual environments).
ISSN 2277-1956/V1N4-2324-2327
2. Head and face gestures: examples include a) nodding or shaking the head, b) direction of eye gaze, c) raising the eyebrows, d) opening and closing the mouth, e) winking, f) flaring the nostrils, and g) looks of surprise, happiness, disgust, fear, sadness, and many others.
3. Body gestures: involvement of full-body motion, as in a) tracking the movements of two people having a conversation, b) analyzing the movements of a dancer against the music being played and its rhythm, and c) recognizing human gaits for medical rehabilitation and athletic training.
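The static/dynamic distinction drawn above can be illustrated with a minimal sketch. The feature values, pose templates and thresholds below are hypothetical placeholders (a real system would extract such features from camera frames); the point is only that a static pose is matched at a single instant, while a dynamic gesture such as a head nod or shake is classified from how measurements change over time.

```python
import math

# --- Static gesture: match one feature vector against stored pose templates ---
# Hypothetical 3-value hand features (e.g. finger spread, thumb angle, openness).
POSE_TEMPLATES = {
    "stop":      (1.0, 0.0, 1.0),   # open palm facing forward
    "thumbs_up": (0.2, 0.9, 0.3),
}

def classify_static(features):
    """Nearest-template classification of a single-instant pose."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(POSE_TEMPLATES, key=lambda name: dist(features, POSE_TEMPLATES[name]))

# --- Dynamic gesture: classify a sequence of head angles observed over time ---
def classify_head_gesture(pitch_seq, yaw_seq, threshold=5.0):
    """A nod moves mostly in pitch (up/down); a shake mostly in yaw (left/right)."""
    pitch_range = max(pitch_seq) - min(pitch_seq)
    yaw_range = max(yaw_seq) - min(yaw_seq)
    if max(pitch_range, yaw_range) < threshold:
        return "none"                      # too little motion to call a gesture
    return "nod" if pitch_range > yaw_range else "shake"

print(classify_static((0.9, 0.1, 0.95)))                       # prints "stop"
print(classify_head_gesture([0, 8, -6, 7, -5], [0, 1, 0, 1, 0]))  # prints "nod"
```

A full recognizer would replace the nearest-template and range heuristics with statistical models such as the hidden Markov models used in [3], but the split between instantaneous pose matching and temporal sequence analysis remains the same.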
The systems summarized below have applied gesture technology:

System [Ref] / Year | Gesture type | Technology used | Remarks
Intelligent Smart Home Control Using Body Gestures [10] / 2006 | Elderly; hand | Smart home with 3 CCD cameras; marker attached to the human body | -
Head gesture recognition [11] / 2007 | General | Wheelchair with laptop and webcam | -
Select-and-Point [12] / 2008 | General | Composed of three parts: a presence server, a controlling peer and a controlled peer, using a camera | -
User-Defined Gestures for Surface Computing [13] | General; hand | Software tools, a user-defined gesture set, implications for surface technology, and a taxonomy of surface gestures | Its results include a gesture taxonomy, the user-defined gesture set, performance measures, subjective responses, and qualitative observations.
Designing gesture design patterns for (multi-touch) screens / 2011 | General | Desktop operating systems, mobile operating systems, 3rd-party software, small software products, and common hardware products | Discusses the evolution of interface design from a hardware-driven to a software-driven approach.
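Whatever the sensing technology, systems like those summarized above end with the same final stage: mapping a recognized gesture to a device command. A minimal sketch of that stage follows; the gesture names and commands are invented for illustration and do not come from the surveyed systems.

```python
# Hypothetical mapping from recognized gestures to smart-home commands.
COMMANDS = {
    "wave_left":  "lights_off",
    "wave_right": "lights_on",
    "raise_hand": "tv_volume_up",
}

def dispatch(gesture, send=print):
    """Look up the command for a recognized gesture and send it to the device.

    Unknown gestures are ignored rather than guessed, so a misrecognized
    gesture cannot trigger an unintended action.
    """
    command = COMMANDS.get(gesture)
    if command is not None:
        send(command)
    return command

dispatch("wave_right")   # prints "lights_on"
dispatch("shrug")        # unknown gesture: no command sent
```

Keeping this mapping as explicit data, separate from the recognition logic, is what lets systems such as [13] substitute a user-defined gesture set without changing the control code.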
V. CONCLUSION

In this paper, we introduced gestures, described their types, and reviewed different methods of acquiring them. A primary goal of virtual environments is to provide natural, efficient and flexible interaction between human and computer, and gestures as an input modality can help meet these requirements. Human gestures, being natural, are efficient and powerful, and this paper has surveyed this powerful medium.
VI. REFERENCES
[1] Cadoz, C., Le Geste Canal de Communication Homme-Machine, la Communication Instrumentale, Technique et Science Informatiques, vol. 13, no. 1, 1994, pp. 31-61.
[2] S. Mitra and T. Acharya, Gesture recognition: A survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C, vol. 37, no. 3, pp. 311-324, 2007.
[3] Chih-Ming Fu et al., Hand gesture recognition using a real-time tracking method and hidden Markov models, Image and Vision Computing, vol. 21, no. 8, 1 August 2003, pp. 745-758.
[4] Harshith C., Karthik R. Shastry, Manoj Ravindran, M.V.V.N.S. Srikanth, Naveen Lakshmikhanth, Survey on various gesture recognition techniques for interfacing machines based on ambient intelligence, IJCSES, vol. 1, no. 2, November 2010.
[5] Vadakkepat, P. et al., Multimodal Approach to Human-Face Detection and Tracking, IEEE Transactions on Industrial Electronics, vol. 55, no. 3, March 2008, pp. 1385-1393.
[6] E. Kollorz, J. Hornegger and A. Barke, Gesture recognition with a time-of-flight camera, International Journal of Intelligent Systems Technologies and Applications (Dynamic 3D Imaging), vol. 5, no. 3-4, 2008, pp. 334-343.
[7] Bohme, M., Haker, M., Martinetz, T., and Barth, E. (2007), A facial feature tracker for human-computer interaction based on 3D TOF cameras, Dynamic 3D Imaging (workshop in conjunction with DAGM 2007).
[8] Malik, S. and Laszlo, J. (2004), Visual Touchpad: A Two-handed Gestural Input Device, in Proceedings of the ACM International Conference on Multimodal Interfaces, p. 289.
[9] Sanna K., Juha K., Jani M. and Johan M. (2006), Visualization of Hand Gestures for Pervasive Computing Environments, in Proceedings of the Working Conference on Advanced Visual Interfaces, ACM, Italy, pp. 480-483.
[10] Kim, D. and Kim, D. (2006), An Intelligent Smart Home Control Using Body Gestures, in Proceedings of the International Conference on Hybrid Information Technology (ICHIT'06), IEEE, Korea.
[11] Jia, P. and Huosheng H. Hu (2007), Head gesture recognition for hands-free control of an intelligent wheelchair, Industrial Robot: An International Journal, Emerald, pp. 60-68.
[12] Hyunglae Lee, Heeseok Jeong, JoongHo Lee, Ki-Won Yeom, HyunJin Shin, Ji-Hyung Park, "Select-and-Point: A Novel Interface for Multi-Device Connection and Control based on Simple Hand Gestures", CHI 2008, April 5-10, 2008, Florence, Italy, ACM 978-1-60558-012-8/08/04.
[13] Jacob O. Wobbrock, Meredith Ringel Morris, Andrew D. Wilson, User-Defined Gestures for Surface Computing.