
ABSTRACT: Is it possible to create a computer that can interact with us the way we interact with each other? Imagine that one fine morning you walk into your computer room, switch on your computer, and it tells you: "Hey friend, good morning, you seem to be in a bad mood today." It then opens your mailbox, shows you some of your mail and tries to cheer you up. This may sound like fiction, but it is the kind of life BLUE EYES promises in the very near future. The basic idea behind the technology is to give the computer human-like perceptual power. We all have perceptual abilities: we can understand each other's feelings, for example by reading a person's emotional state from their facial expression. Adding these perceptual abilities to computers would enable them to work together with human beings as intimate partners. The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings.

How can we make computers "see" and "feel"? Blue Eyes uses sensing technology to identify a user's actions and to extract key information. This information is then analyzed to determine the user's physical, emotional, or informational state, which in turn can be used to make the user more productive by performing expected actions or by providing expected information. For example, a future Blue Eyes-enabled television could become active when the user makes eye contact, at which point the user could tell it to "turn on". This paper covers the hardware, the software, the benefits and the interconnection of the various parts involved in Blue Eyes technology.

INTRODUCTION

Imagine yourself in a world where humans interact naturally with computers. You are sitting in front of a personal computer that can listen, talk, or even scream aloud. It can gather information about you and interact with you through techniques such as facial recognition and speech recognition. It can even sense your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You can ask the computer to dial your friend at his office; it senses the urgency of the situation through the mouse, dials your friend, and establishes the connection.

Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding such perceptual abilities to computers would enable them to work together with human beings as intimate partners. Researchers are therefore attempting to add capabilities that will allow computers to interact like humans: to recognize human presence, talk, listen, or even guess a person's feelings. The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses a non-obtrusive sensing method, employing modern video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.

System overview

The Blue Eyes system provides technical means for monitoring and recording an operator's basic physiological parameters. The most important parameter is saccadic activity, which enables the system to monitor the status of the operator's visual attention, along with head acceleration, which accompanies large displacements of the visual axis (saccades larger than 15 degrees). A complex industrial environment can expose the operator to toxic substances, which can affect his cardiac, circulatory and pulmonary systems.

Thus, on the grounds of a plethysmographic signal taken from the forehead skin surface, the system computes heart-beat rate and blood oxygenation. The Blue Eyes system checks these parameters against abnormal values (e.g. a low level of blood oxygenation or a high pulse rate) or undesirable ones (e.g. a longer period of lowered visual attention) and triggers user-defined alarms when necessary.

Quite often in an emergency situation operators speak to themselves, expressing their surprise or stating the problem verbally. Therefore the operator's voice, physiological parameters and an overall view of the operating room are recorded. This helps to reconstruct the course of the operator's work and provides data for long-term analysis.

The system consists of a mobile measuring device and a central analytical system. The mobile device is integrated with a Bluetooth module providing a wireless interface between the sensors worn by the operator and the central unit. ID cards assigned to each operator, together with corresponding user profiles on the central-unit side, provide the necessary data personalization, so different people can use a single mobile device (called hereafter the DAU, Data Acquisition Unit). The overall system diagram is shown in Figure 1. The tasks of the mobile Data Acquisition Unit are to maintain the Bluetooth connection, to get information from the sensor and send it over the wireless link, to deliver alarm messages sent from the Central System Unit to the operator, and to handle personalized ID cards. The Central System Unit maintains the other side of the Bluetooth connection, buffers incoming sensor data, performs on-line data analysis, records conclusions for further exploration and provides a visualization interface.

Implementation and engineering considerations

Functional design

During the functional design phase we used the UML standard use-case notation, which shows the functions the system offers to particular users. We defined three groups of users: operators, supervisors and system administrators.

Operator is a person whose physiological parameters are supervised. The operator wears the DAU. The only functions offered to that user are authorization in the system and receiving alarm alerts. Such limited functionality assures that the device does not disturb the operator's work (Fig. 2).

Authorization: this function is used when the operator's duty starts. After inserting his personal ID card into the mobile device and entering the proper PIN code, the device starts listening for incoming Bluetooth connections. Once the connection has been established and the authorization process has succeeded (the PIN code is correct), the central system starts monitoring the operator's physiological parameters. The authorization process must be repeated after reinserting the ID card; it is not, however, required when the Bluetooth connection is merely re-established.
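A minimal sketch of this authorization logic is given below, purely for illustration: the DAU class and its method names are assumptions, not the project's actual code, and only the behavior described above (PIN checked once per card insertion, but not repeated on Bluetooth reconnection) is modeled.

```python
class DAU:
    """Hypothetical sketch of the DAU-side authorization flow described above."""

    def __init__(self, system_pin: str):
        self.system_pin = system_pin       # PIN associated with the ID card in the central database
        self.authorized = False

    def insert_card(self, card_id: str, entered_pin: str) -> bool:
        # Re-inserting the ID card always forces a fresh authorization.
        self.authorized = (entered_pin == self.system_pin)
        if self.authorized:
            print(f"operator {card_id}: central system starts monitoring")
        else:
            print("authorization failed: wrong PIN")   # signalled via beeper / LCD
        return self.authorized

    def bluetooth_reconnected(self) -> bool:
        # Re-establishing the Bluetooth link does NOT require entering the PIN again.
        return self.authorized


dau = DAU(system_pin="1234")
dau.insert_card("operator-1", "1234")      # authorized, monitoring starts
print(dau.bluetooth_reconnected())         # True: no new PIN entry needed
```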

Receiving alerts: this function supplies the operator with information about the most important alerts regarding his or his co-workers' condition and about the mobile device state (e.g. connection lost, battery low). Alarms are signaled using a beeper, an earphone providing sound feedback from the central system, and a small alphanumeric LCD display that shows more detailed information.

Supervisor is a person responsible for analyzing the operators' condition and performance. The supervisor receives tools for inspecting the present values of the parameters (on-line browsing) as well as for browsing the results of long-term analysis (off-line browsing). During on-line browsing it is possible to watch a list of currently working operators and the status of their mobile devices. Selecting one of the operators enables the supervisor to check the operator's current physiological condition (e.g. a pie chart showing active brain involvement) and the history of alarms regarding that operator. All new incoming alerts are displayed immediately so that the supervisor can react quickly. However, the presence of a human supervisor is not strictly necessary, since the system is equipped with reasoning algorithms and can trigger user-defined actions (e.g. inform the operator's co-workers). During off-line browsing it is possible to reconstruct the course of the operator's duty with all the physiological parameters, audio and video data. A comprehensive data analysis can be performed, enabling the supervisor to draw conclusions about the operator's overall performance and competency (e.g. suitability for working night shifts).

System administrator is a user that maintains the system. The administrator is given tools for adding new operators to the database, defining alarm conditions, configuring logging tools and creating new analyzer modules. While registering a new operator the administrator enters the appropriate data (and a photo, if available) into the system database and programs the operator's personal ID card.

Defining alarm conditions: this function enables setting up user-defined alarm conditions by writing condition-action rules (e.g. low saccadic activity over a longer period of time: inform the operator's co-workers, wake him up using the beeper or an appropriate sound, and log the event in the database).

Designing new analyzer modules: based on earlier recorded data, the administrator can create new analyzer modules that recognize behaviors other than those built into the system. The new modules are created using a decision-tree induction algorithm. The administrator names the new behavior to be recognized and points to the data associated with it; the results produced by the new modules can then be used in alarm conditions.
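The condition-action rules mentioned under "Defining alarm conditions" above can be pictured as simple predicate/handler pairs evaluated against incoming parameter snapshots. The following Python sketch is only an illustration of the idea; the rule format, field names and thresholds are assumptions, not the system's actual configuration syntax.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class AlarmRule:
    name: str
    condition: Callable[[Dict[str, float]], bool]    # evaluated on each parameter snapshot
    actions: List[Callable[[str], None]]              # e.g. beep, notify co-workers, log

def beep(operator: str) -> None:
    print(f"[beeper] waking up {operator}")

def notify_coworkers(operator: str) -> None:
    print(f"[alert] co-workers of {operator} informed")

def log_event(operator: str) -> None:
    print(f"[db] event logged for {operator}")

# Example rules in the spirit of the text: prolonged low saccadic activity, low oxygenation.
rules = [
    AlarmRule("low visual attention",
              lambda p: p["saccadic_activity"] < 0.2 and p["low_attention_seconds"] > 60,
              [notify_coworkers, beep, log_event]),
    AlarmRule("low blood oxygenation",
              lambda p: p["spo2"] < 0.90,
              [notify_coworkers, log_event]),
]

def check_rules(operator: str, snapshot: Dict[str, float]) -> None:
    for rule in rules:
        if rule.condition(snapshot):
            for action in rule.actions:
                action(operator)

check_rules("operator-1",
            {"saccadic_activity": 0.1, "low_attention_seconds": 90, "spo2": 0.97})
```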

Monitoring setup: enables the administrator to choose the parameters to monitor as well as the algorithms of the desired accuracy used to compute the parameter values.

Logger setup: provides tools for selecting the parameters to be recorded. For audio data the sampling frequency can be chosen; for the video signal, a delay between storing consecutive frames can be set (e.g. one picture every two seconds).

Database maintenance: here the administrator can remove old or uninteresting data from the database. The uninteresting data is suggested by the built-in reasoning system.

Data Acquisition Unit (DAU)

In this section we describe the hardware part of the BlueEyes system with regard to the physiological data sensor, the DAU hardware components and the microcontroller software.

Physiological data sensor

To provide the Data Acquisition Unit with the necessary physiological data we decided to purchase an off-the-shelf eye-movement sensor, the Jazz Multisensor. It supplies raw digital data regarding eye position, the level of blood oxygenation, acceleration along the horizontal and vertical axes, and ambient light intensity. Eye movement is measured using direct infrared oculographic transducers. Eye movement is sampled at 1 kHz and the other parameters at 250 Hz; the sensor therefore sends approximately 5.2 kB of data per second.
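A rough back-of-the-envelope check of that figure follows from the sampling rates above. The per-sample widths used below are assumptions made only for illustration (the sensor's actual frame format is not described in this paper); the point is simply that a stream of this size is plausible for this set of channels and is easily carried over the Bluetooth link.

```python
# Hypothetical estimate of the Jazz Multisensor output rate.
# Assumed widths: 2 bytes per eye-position sample (x and y), 1 byte per auxiliary sample.
eye_rate_hz, eye_channels, eye_bytes = 1000, 2, 2   # eye position sampled at 1 kHz
aux_rate_hz, aux_channels, aux_bytes = 250, 4, 1    # SpO2, accel-h, accel-v, light at 250 Hz

bytes_per_second = (eye_rate_hz * eye_channels * eye_bytes
                    + aux_rate_hz * aux_channels * aux_bytes)
print(bytes_per_second / 1000, "kB/s")   # ~5.0 kB/s before framing and status overhead
```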

Central System Unit (CSU)

The CSU software runs on the delivered Toshiba laptop; in case of larger resource demands the processing can be distributed among a number of nodes. In this section we describe the four main CSU modules (see Fig. 1): Connection Manager, Data Analysis, Data Logger and Visualization. The modules exchange data using specially designed single-producer, multi-consumer buffered thread-safe queues. Any number of consumer modules can register to receive the data supplied by a producer, and every consumer can register with any number of producers, thereby receiving different types of data. Naturally, every consumer may in turn act as a producer for other consumers (for the system core class diagram see Appendix C). This approach gives the system high scalability: new data-processing modules (e.g. filters, data analyzers and loggers) can be added simply by registering as consumers.

Connection Manager

The Connection Manager's main task is to perform low-level Bluetooth communication using Host Controller Interface (HCI) commands. It is designed to cooperate with all available Bluetooth devices in order to support roaming (for the detailed CSU hardware specification see Appendix D). Additionally, the Connection Manager authorizes operators, manages their sessions, and demultiplexes and buffers raw physiological data. Figure 11 shows the Connection Manager architecture.

Transport Layer Manager hides the details of the underlying physical interface (which can be RS232, UART or USB) and provides a uniform HCI command interface.

Bluetooth Connection Manager is responsible for establishing and maintaining connections using all available Bluetooth devices. It periodically inquires for new devices in the operating range and checks whether they are registered in the system database; the Connection Manager communicates only with registered devices. After a connection is established, an authentication procedure takes place using the system PIN code fetched from the database. Once the connection has been authenticated, the mobile unit sends a data frame containing the operator's identifier. Finally, the Connection Manager adds an SCO link (voice connection) and starts a new dedicated Operator Manager, which manages the new operator's session (for detailed Bluetooth communication flow charts and protocols see Appendix E). Additionally, the Connection Manager maps operator identifiers onto Bluetooth connections, so that when operators roam around the covered area a connection with an appropriate Bluetooth device is established and the data stream is redirected accordingly.
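The single-producer, multi-consumer buffered queues through which the CSU modules exchange data (described at the beginning of this section) can be sketched roughly as follows. This is a minimal Python illustration of the pattern, not the project's actual class design; names such as ProducerQueue are assumptions.

```python
import queue
import threading

class ProducerQueue:
    """Sketch of a single-producer, multi-consumer buffered thread-safe queue."""

    def __init__(self, maxsize: int = 1024):
        self._consumers = []
        self._lock = threading.Lock()
        self._maxsize = maxsize

    def register_consumer(self) -> "queue.Queue":
        # Each consumer gets its own buffered queue, so a slow consumer
        # does not block the producer or the other consumers.
        q = queue.Queue(maxsize=self._maxsize)
        with self._lock:
            self._consumers.append(q)
        return q

    def publish(self, item) -> None:
        # Called by the single producer; fan the item out to every registered consumer.
        with self._lock:
            consumers = list(self._consumers)
        for q in consumers:
            q.put(item)

# Usage: a raw-data producer fanned out to a logger and an analyzer.
raw_data = ProducerQueue()
logger_in = raw_data.register_consumer()
analyzer_in = raw_data.register_consumer()
raw_data.publish({"eye_x": 12, "eye_y": -3, "spo2": 0.97})
print(logger_in.get(), analyzer_in.get())
```

A module that both consumes and produces (e.g. a filter or an analyzer) would simply hold a consumer queue obtained from one ProducerQueue and expose its own ProducerQueue to downstream modules, which matches the registration scheme described above.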

The data of each supervised operator is buffered separately in a dedicated Operator Manager. At startup the Operator Manager communicates with the Operator Data Manager in order to get more detailed personal data. Its most important task is to buffer the incoming raw data and to split it into separate data streams related to each of the measured parameters. The raw data is sent to a Logger Module, while the split data streams are made available to the other system modules through producer-consumer queues. Furthermore, the Operator Manager provides an interface for sending alert messages to the related operator.

Operator Data Manager provides an interface to the operator database, enabling the other modules to read or write personal data and system access information.

Data Analysis Module

The module analyzes the raw sensor data in order to obtain information about the operator's physiological condition. A separately running Data Analysis Module supervises each of the working operators. The module consists of a number of smaller analyzers extracting different types of information. Each analyzer registers at the appropriate Operator Manager or at another analyzer as a data consumer and, acting as a producer, provides the results of its analysis. An analyzer can be a simple signal filter (e.g. a Finite Impulse Response (FIR) filter), a generic data extractor (e.g. signal variance, saccade detector) or a custom detector module. Since we are not able to predict all the supervisors' needs, the custom modules are created by applying a supervised machine-learning algorithm to a set of earlier recorded examples containing the characteristic features to be recognized. In the prototype we used an improved C4.5 decision-tree induction algorithm. The computed features can be, for example, the operator's position (standing, walking, lying) or whether his eyes are closed or open.

As built-in analyzer modules we implemented a saccade detector and visual attention level, blood oxygenation and pulse rate analyzers. The saccade detector registers as a consumer of eye-movement and accelerometer signal-variance data and uses these data to signal saccade occurrence. Since saccades are the fastest eye movements, the algorithm calculates eye-movement velocity and checks physiological constraints. The algorithm performs two main steps:

A. User adjustment step. This phase takes about 5 s. After buffering approximately 5 s of the signal, differentiate it using the three-point central difference algorithm, which gives an eye-velocity time series. Sort the velocities by absolute value and take the upper-15% border velocity along both the X and Y axes; the results v0x and v0y are the cut-off velocities.
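A sketch of this adjustment step is given below. It is illustrative only: the array shapes, the interpretation of the "upper 15%" border as the 85th percentile of absolute velocity, and the synthetic signals in the example are assumptions.

```python
import numpy as np

def calibrate_cutoffs(eye_x: np.ndarray, eye_y: np.ndarray, fs: float = 1000.0):
    """User adjustment step: derive cut-off velocities v0x, v0y from ~5 s of eye data.

    eye_x / eye_y are 1-D eye-position samples taken at fs Hz (1 kHz in the text).
    """
    dt = 1.0 / fs

    def central_diff(x: np.ndarray) -> np.ndarray:
        # Three-point central difference: v[i] = (x[i+1] - x[i-1]) / (2*dt)
        return (x[2:] - x[:-2]) / (2.0 * dt)

    vx = np.abs(central_diff(eye_x))
    vy = np.abs(central_diff(eye_y))

    # "Upper 15%" border velocity, read here as the 85th percentile of |velocity|.
    v0x = np.percentile(vx, 85.0)
    v0y = np.percentile(vy, 85.0)
    return v0x, v0y

# Example with 5 s of synthetic data sampled at 1 kHz.
t = np.arange(0.0, 5.0, 0.001)
v0x, v0y = calibrate_cutoffs(np.sin(2 * np.pi * t), np.cos(2 * np.pi * t))
print(v0x, v0y)
```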

B. On-line analyzer flow. Continuously calculate the eye-movement velocity using the three-point central difference algorithm. If the velocity exceeds the pre-calculated cut-off v0 (both axes are considered separately), a saccade may have occurred. Then check the following conditions, and if any of them is satisfied do not report a saccade: the last saccade was detected less than 130 ms ago (a physiological constraint: saccades do not occur more frequently); the movement is nonlinear (physiological constraint); the accelerometer signal indicates rapid head movement, which may force eye activity of comparable speed; the accelerometer signal is extremely uneven, in which case the eye signal should be ignored because the sensor device itself may have moved (a sketch of this per-sample check is given after the remaining CSU modules below).

Data Logger Module

The module provides support for storing the monitored data in order to enable the supervisor to reconstruct and analyze the course of the operator's duty. The module registers as a consumer of the data to be stored in the database (for the database schema see Appendix G). Each working operator's data is recorded by a separate instance of the Data Logger. Apart from the raw or processed physiological data, alerts and the operator's voice are stored. The raw data is supplied by the related Operator Manager module (Fig. 11), whereas the Data Analysis module delivers the processed data. The voice data is delivered by a Voice Data Acquisition module, which registers as a consumer of the operator's voice data and optionally processes the sound before storage (i.e. reduces noise or removes the fragments in which the operator does not speak). The Logger's task is to add appropriate time stamps so that the system can later reconstruct the voice. Additionally, there is a dedicated video data logger, which records the data supplied by the Video Data Acquisition module (in the prototype we use JPEG compression). The module is designed to handle one or more cameras using the Video for Windows standard. The cameras can either be connected directly to the system or be accessible through a UDP network connection (for the detailed camera protocol specification see Appendix H). The Data Logger is able to use any ODBC-compliant database system; in the prototype we used MS SQL Server, which is a part of the Project Kit.

Visualization Module

The module provides the user interface for the supervisors. It enables them to watch each working operator's physiological condition along with a preview of a selected video source and the related sound stream. All incoming alarm messages are instantly signaled to the supervisor. Moreover, the Visualization Module can be set to an off-line mode, in which all the data is fetched from the database. By watching all the recorded physiological parameters, alarms, video and audio data, the supervisor is able to reconstruct the course of the selected operator's duty (for the detailed Visualization Module structure see Appendix I).
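Returning to the saccade detector's on-line analyzer flow (step B above), the per-sample check can be sketched as follows. This is only an illustration under stated assumptions: the 130 ms refractory period and the v0 cut-offs are from the text, while the head-acceleration threshold and the omission of the nonlinearity test are simplifications, and the function name is hypothetical.

```python
import numpy as np

def detect_saccades(eye_x, eye_y, head_accel, v0x, v0y, fs=1000.0):
    """Illustrative on-line saccade check using the calibrated cut-offs v0x, v0y."""
    dt = 1.0 / fs
    accel_limit = 2.0 * np.std(head_accel)      # assumed "enormously uneven" threshold
    saccade_times, last_t = [], -np.inf
    for i in range(1, len(eye_x) - 1):
        vx = (eye_x[i + 1] - eye_x[i - 1]) / (2 * dt)   # three-point central difference
        vy = (eye_y[i + 1] - eye_y[i - 1]) / (2 * dt)
        t = i * dt
        if abs(vx) < v0x and abs(vy) < v0y:
            continue                                    # below cut-off: not a saccade
        if t - last_t < 0.130:
            continue                                    # physiological refractory period
        if abs(head_accel[i]) > accel_limit:
            continue                                    # probable head or sensor movement
        saccade_times.append(t)
        last_t = t
    return saccade_times

# Example (reusing the synthetic signals and cut-offs from the calibration sketch):
# times = detect_saccades(eye_x, eye_y, head_accel, v0x, v0y)
```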

Applications:

1. Engineers at IBM's Almaden Research Center in San Jose, CA, report that a number of large retailers have implemented surveillance systems that record and interpret customer movements, using software from Almaden's BlueEyes research project. BlueEyes is developing ways for computers to anticipate users' wants by gathering video data on eye movement and facial expression. Your gaze might rest on a Web site heading, for example, and that would prompt your computer to find similar links and call them up in a new window. But the first practical use for the research turns out to be watching shoppers. BlueEyes software makes sense of what the cameras see in order to answer key questions for retailers: How many shoppers ignored a promotion? How many stopped? How long did they stay? Did their faces register boredom or delight? How many reached for the item and put it in their shopping carts? BlueEyes works by tracking pupil, eyebrow and mouth movement. When monitoring pupils, the system uses a camera and two infrared light sources placed inside the product display. One light source is aligned with the camera's focus; the other is slightly off axis. When the eye looks into the camera-aligned light, the pupil appears bright to the sensor, and the software registers the customer's attention. In this way the system captures shoppers' attention and buying preferences. BlueEyes is actively being incorporated into some of the leading retail outlets.

2. Another application would be in the automobile industry. By simply touching a computer input device such as a mouse, the computer system is designed to determine a person's emotional state. In cars, it could help with critical decisions, for example: "I know you want to get into the fast lane, but I'm afraid I can't do that; you're too upset right now", and thereby assist in driving safely.

3. Current interfaces between computers and humans can present information vividly, but have no sense of whether that information is ever viewed or understood. In contrast, new real-time sensing systems can detect where the user is looking and what activity is occurring, and therefore which display or messaging modalities are most appropriate in the current situation. The results of this research will allow the interface between computers and human users to become more natural and intuitive.

4. We could also see its use in video games, where it could give individual challenges to the customers playing them, typically targeting commercial businesses. The integration of children's toys, technologies and computers is enabling new play experiences that were not commercially feasible until recently. The Intel Play QX3 Computer Microscope, the Me2Cam with Fun Fair, and the Computer Sound Morpher are commercially available smart-toy products developed by the Intel Smart Toy Lab. One theme common across these PC-connected toys is that users interact with them using a combination of visual, audible and tactile input and output modalities. Such products pose some unique challenges for the designers and engineers of experiences targeted at novice computer users, namely young children.

5. The familiar and the useful come from things we recognize. The appearance of many of our favorite things communicates their use; they show the change in their value through patina. As technologists we are now poised to imagine a world where computing objects communicate with us in situ, wherever we are. We use our looks, feelings and actions to give the computer the experience it needs to work with us. Keyboards and mice will not continue to dominate computer user interfaces; keyboard input will be replaced in large measure by systems that know what we want and require less explicit communication. Sensors are gaining the fidelity and ubiquity needed to record presence and actions: they will notice when we enter a space, sit down, lie down, pump iron, etc., and a pervasive infrastructure will record it. Work in this direction is being pursued under the banner of context-aware computing.

Conclusion:

The wireless link between the sensors worn by the operator and the supervising system makes it possible to improve overall reliability and safety and assures a proper quality of system performance. These new possibilities can cover areas such as industry, transportation, military command centers or operating theaters. Researchers are attempting to add more capabilities to computers that will allow them to interact like humans: to recognize human presence, talk, listen, or even guess a person's feelings. The system helps avoid potential threats resulting from human error caused by weariness, oversight or tiredness.
