
ABSTRACT

Is it possible to create a computer that can interact with us the way we interact with each other? Imagine that one fine morning you walk into your computer room and switch on your computer, and it greets you: "Good morning, friend, you seem to be in a bad mood today." It then opens your mailbox, shows you some of your mail, and tries to cheer you up. This sounds like fiction, but it will be everyday life with BLUE EYES in the very near future. The basic idea behind this technology is to give the computer human-like perceptual power. We all have perceptual abilities: we can understand each other's feelings, for example by reading a person's emotional state from his facial expression. Adding these human perceptual abilities to computers would enable them to work together with human beings as intimate partners. The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings.

INTRODUCTION

Imagine yourself in a world where humans interact naturally with computers. You are sitting in front of your personal computer, which can listen, talk, or even scream aloud. It can gather information about you and interact with you through special techniques such as facial recognition and speech recognition. It can even understand your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You ask the computer to dial your friend at his office; it senses the urgency of the situation through the mouse, dials your friend, and establishes a connection. Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding such perceptual abilities to computers would enable them to work together with human beings as intimate partners. Researchers are attempting to add capabilities to computers that will allow them to interact like humans: recognize human presence, talk, listen, and even guess a person's feelings.

The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses a non-obtrusive sensing method, employing modern video cameras and microphones, to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.

EMOTION MOUSE

One goal of human-computer interaction (HCI) is to make an adaptive, smart computer system. Such a project could include gesture recognition, facial recognition, eye tracking, speech recognition, etc. Another non-invasive way to obtain information about a person is through touch. People use their computers to obtain, store, and manipulate data. For a computer to become smart, it must start gaining information about its user. Our proposed method for gaining user information through touch is via a computer input device, the mouse. From the physiological data obtained from the user, an emotional state may be determined, which can then be related to the task the user is currently doing on the computer. Over a period of time, a user model is built in order to gain a sense of the user's personality. The scope of the project is to have the computer adapt to the user in order to create a better working environment where the user is more productive. The first steps towards realizing this goal are described here.
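As a rough sketch of the kind of longitudinal user model described above, the following accumulates per-task averages of physiological readings; the attribute names and sample values are invented for illustration:

```python
from collections import defaultdict

class UserModel:
    """Toy longitudinal model: accumulates physiological readings per task
    so that recurring patterns hint at the user's typical baseline state."""
    def __init__(self):
        self._sums = defaultdict(lambda: [0.0, 0.0, 0.0])
        self._counts = defaultdict(int)

    def record(self, task, gsr, temperature, pulse):
        # Add one (GSR, finger temperature, pulse) sample for this task.
        s = self._sums[task]
        s[0] += gsr; s[1] += temperature; s[2] += pulse
        self._counts[task] += 1

    def baseline(self, task):
        # Running average of each attribute for the given task.
        n = self._counts[task]
        return tuple(v / n for v in self._sums[task])

model = UserModel()
model.record("editing", gsr=2.1, temperature=33.5, pulse=72)
model.record("editing", gsr=2.3, temperature=33.9, pulse=76)
print(tuple(round(v, 2) for v in model.baseline("editing")))  # (2.2, 33.7, 74.0)
```

A deployed system would, of course, compare new readings against such a baseline rather than raw values, since resting physiology differs between users.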

2.1 EMOTION AND COMPUTING

Rosalind Picard (1997) describes why emotions are important to the computing community. There are two aspects of affective computing: giving the computer the ability to detect emotions and giving the computer the ability to express emotions. Not only are emotions crucial for rational decision making, as Picard describes, but emotion detection is an important step towards an adaptive computer system. The goal of an adaptive, smart computer system has been driving our efforts to detect a person's emotional state. An important reason for incorporating emotion into computing is productivity for the computer user. A study (Dryer & Horowitz, 1997) has shown that people with personalities that are similar or complement each other collaborate well. Dryer (1999) has also shown that people view their computer as having a personality. For these reasons, it is important to develop computers that work well with their users.

By matching a person's emotional state with the context of the expressed emotion over a period of time, the person's personality is exhibited. Therefore, by giving the computer a longitudinal understanding of the emotional state of its user, the computer could adopt a working style which fits its user's personality. The result of this collaboration could be increased productivity for the user. One way of gaining information from a user non-intrusively is by video; cameras have been used to detect a person's emotional state (Johnson, 1999). We have explored gaining information through touch. One obvious place to put sensors is on the mouse: observation of normal computer usage (creating and editing documents and surfing the web) shows that people spend approximately one third of their total computer time touching their input device. Because of this large amount of time spent touching an input device, we explore the possibility of detecting emotion through touch.

2.2 THEORY

Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997). One of his experiments involved participants attached to devices recording certain measurements, including pulse, galvanic skin response (GSR), temperature, somatic movement, and blood pressure. He recorded these measurements while the participants were instructed to mimic facial expressions corresponding to the six basic emotions, which he defined as anger, fear, sadness, disgust, joy, and surprise. From this work, Dryer (1993) determined how physiological measures could be used to distinguish various emotional states.

Six participants were trained to exhibit the facial expressions of the six basic emotions. While each participant exhibited these expressions, the physiological changes associated with affect were assessed. The measures taken were GSR, heart rate, skin temperature, and general somatic activity (GSA). These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data. This analysis suggested that the physiological similarities and dissimilarities of the six emotional states fit within a four-dimensional model. For the second analysis, a discriminant function analysis was used to determine the mathematical functions that would distinguish the six emotional states. This analysis suggested that all four physiological variables made significant, non-redundant contributions to the functions that distinguish the six states. Moreover, these analyses indicate that these four physiological measures are sufficient to determine reliably a person's specific emotional state. Because of our need to incorporate these measurements into a small, non-intrusive form, we explore taking these measurements from the hand. Skin conductivity is best measured from the fingers; the other measures may not be as obvious or robust. We hypothesize that changes in the temperature of the finger are a reliable predictor of emotion, and that GSA can be measured from changes in the movement of the computer mouse. Our efforts to develop a robust pulse meter are not discussed here.
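As a rough illustration of the classification idea, the discriminant functions above can be stood in for by a simple nearest-centroid rule over the four measures; every centroid value below is invented for illustration, not taken from Dryer's data:

```python
import math

# Invented per-emotion centroids of (GSR, heart rate, skin temperature, GSA);
# a nearest-centroid rule stands in for fitted discriminant functions.
CENTROIDS = {
    "anger":    (4.2, 88.0, 34.8, 0.9),
    "fear":     (4.5, 92.0, 32.1, 0.7),
    "sadness":  (2.1, 68.0, 32.8, 0.2),
    "disgust":  (3.0, 74.0, 33.5, 0.5),
    "joy":      (2.8, 80.0, 34.2, 0.6),
    "surprise": (3.6, 85.0, 33.9, 0.8),
}

def classify(sample):
    """Return the emotion whose centroid is nearest in the 4-D measure space."""
    return min(CENTROIDS, key=lambda e: math.dist(sample, CENTROIDS[e]))

print(classify((2.2, 69.0, 32.9, 0.25)))  # closest to the "sadness" centroid
```

A real system would standardize each measure first (heart rate dominates the raw Euclidean distance here) and would fit the class boundaries from labeled data.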

ABSTRACT

The computers we use in our day-to-day life have tremendous abilities to perform sophisticated tasks easily. They do the tasks assigned to them with lightning speed, understand a variety of computer languages, and effectively compile programs written in these languages. Yet despite their lightning speed and awesome powers of computation, today's PCs are essentially deaf, dumb, and blind. If we want our computers to be genuinely intelligent and to interact naturally with us, we must give them the power to recognize and understand emotions. Imagine a world where humans interact with computers in the same way as with other humans: the computer gathers information about us and interacts accordingly. It can even understand the emotions of its user at the touch of a mouse. This can be brought into reality with the help of the upcoming technology, BLUE EYES. To implement the Blue Eyes technology we must give our personal computers the power to recognize and understand our emotions. The search for a unique and reliable technique for securing resources has finally led to the realization that human physiological traits are unique enough to be used as identifiers; what was lacking was the technology to exploit their potential. The new field of BIOMETRICS emerged as a solution. Thus BLUE EYES, using biometric sensors, can make an incredible impact in the field of emotion detection, which can itself act as a security measure.

1. INTRODUCTION

The BLUE EYES project was started at IBM's Almaden Research Centre in the USA. It aims at giving computers highly developed abilities to perceive, integrate, and interpret visual, auditory, and touch information. The project explores various ways of allowing people to operate computers without conscious effort. Gaze tracking seemed the natural place to start, because the eyes are the most expressive part of a human being and all emotions are reflected in them. These days it is humans who have to adapt to computers by learning various languages, modes of operation, etc. The BLUE EYES project aims at creating computers which can adapt to humans and thus be operated much more conveniently, enabling computers and humans to work together more as partners. The Blue Eyes project makes effective use of existing biometric techniques for emotion detection: different biometric sensors monitor different parts of the human body, and by proper processing of their outputs the emotion of a person is identified. Blue Eyes uses non-obtrusive sensing technologies, such as video cameras and microphones, to identify the user's actions and to extract key information. These clues are analyzed to determine the user's physical, emotional, or informational state.

2. BIOMETRICS

Computers must be given the power to sense the emotions of their users in order to make them 'attentive computers'.
BIOMETRICS is the science used for this implementation: the science of measuring the physiological and behavioral characteristics of a person. Using these characteristics, the emotional state of the user is identified. The eyes are the most expressive part of a human being, so iris scanning is an important technique. The emotions of a human being are directly reflected in his physiological attributes, so measures such as heartbeat and blood pressure give direct information about the user's emotional state. Several techniques are used for emotion detection, including eye gaze tracking, facial expression detection, speech recognition, the Emotion Mouse, and the Jazz multisensor. Besides emotion detection, biometric techniques can be used for security purposes. Physical characteristics such as fingerprints, hand geometry, the retina, and the voice are unique for every human being, so by analyzing these characteristics a person can be easily identified. Several methods, such as fingerprint identification, retinal scanning, and voice identification, have been successfully implemented for security. Here, in the implementation of Blue Eyes, the biometric techniques are used mainly for emotion detection.

3. MAIN BIOMETRIC TECHNIQUES

3.1 Facial expression detection

Facial expression detection consists of two steps: face recognition and expression detection.

Face recognition. Face recognition is applied in a variety of domains, predominantly security. The user's face must be identified before further processing, and the first problem to be solved is finding the face in the image. The first stage of the process is color segmentation, which simply determines whether the proportion of skin-tone pixels is greater than some threshold; candidate regions are then given scores. Next, instead of searching for all the facial features directly in the face image, a few 'high-level' features (eyes, nose, mouth) are located first. Then 26 other 'low-level' features, which may be parts of the eyes, nose, mouth, eyebrows, etc., are located relative to the high-level feature locations. The approximate locations of the high-level features are known from statistics of mean and variance relative to the nose position, gathered on the training database.

Expression detection. Facial expression determination is a field in which rapid research is taking place. The most intriguing invention is the Expression Glasses, a wearable device which is very comfortable to wear and for which surveys among subject users report good results. The device measures the movement of the facial muscles; the movement is then compared with a reference index to determine the emotion. It is used to determine the user's level of interest or confusion by measuring the movement of the muscles around the eyes. The output from the Expression Glasses is fed to the computer for further processing.
There it is converted into a two-colored bar graph, in which red bars indicate confusion and green bars indicate interest. This graph gives a clear indication of the level of interest or confusion.

3.2 Speech recognition

Speech recognition is the process of converting a speech signal to a set of words by means of an algorithm implemented as a computer program. Voice or speaker identification is a related process that attempts to identify the person speaking, as opposed to what is being said. Speech is processed by means of complex voice-processing algorithms. First the speech signal is converted into a set of words by proper sampling, quantization, and coding; these words are called voice prints. A reference index contains different voice prints corresponding to each emotion, and by comparing the subject user's voice print with these references the emotion is identified. Mainly the tone of the voice is compared, rather than what is being said.

3.3 Emotion Mouse

A non-invasive way to obtain information about the user's emotional state is through touch. People use their computer to store and manipulate data. The proposed method for obtaining user information through touch is via a computer input device, the mouse. The computer determines the user's emotional state by a simple touch. Sensors in the mouse sense physiological attributes, which are correlated to emotions using a correlation model. The emotion mouse consists of a number of sensors, each sensing an individual attribute: an IR sensor, a thermistor chip, and a galvanic sensor. The IR sensor measures the heartbeat from the fingertip, the thermistor chip measures the body temperature, and the galvanic sensor measures the skin conductivity. These attributes are combined into a vector representing the user's emotional state; this vector is compared against the emotion-to-attribute correlation model to detect the present emotion.

3.4 Eye gaze tracking

The eye gaze tracking system monitors the eye movement of the user to detect his emotional state. It uses a technique called the pupil finder to monitor the subject user's eye movement.

3.5 Jazz multisensor

The Jazz multisensor is a sensor that senses multiple attributes. It is a mobile device that the user wears on his forehead, with multiple sensors incorporated in it. It uses all the techniques of emotion detection described above, so this single device is capable of detecting a subject user's emotional state.
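The vector matching described for the Emotion Mouse in section 3.3 can be sketched roughly as follows; the reference model, tolerances, and sensor readings are all invented for illustration and are not from any real correlation model:

```python
# Hypothetical reference model: per emotion, expected (heartbeat bpm,
# body temperature in degrees C, skin conductivity in microsiemens)
# together with a per-attribute tolerance.
MODEL = {
    "calm":     ((70, 33.0, 2.0), (8, 1.0, 0.5)),
    "stressed": ((95, 34.5, 4.0), (10, 1.0, 0.8)),
}

def read_sensors():
    # Stand-ins for the IR sensor, thermistor chip, and galvanic sensor.
    return (72, 33.4, 2.2)

def match(vector):
    """Return the emotions whose every attribute lies within tolerance."""
    hits = []
    for emotion, (centre, tol) in MODEL.items():
        if all(abs(v - c) <= t for v, c, t in zip(vector, centre, tol)):
            hits.append(emotion)
    return hits

print(match(read_sensors()))  # ['calm']
```

An empty result would mean the reading matches no modeled emotion, which a real system might handle by falling back to the nearest model or asking for recalibration.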
The device was developed at the research laboratory of Poznan University, Poland. The sensors in a Jazz multisensor include an IR sensor, oculographic transducers, an ambient illumination sensor, expression glasses, and a microphone. The different sensors sense different physiological attributes, producing a composite output. Plethysmographic signals, which originate from the cardiac, circulatory, and pulmonary systems, give a direct indication of the emotional state; the sensors sensing these signals are collectively known as plethysmographic transducers. The IR sensor senses the heartbeat and the level of blood oxygenation; the pulse rate is calculated in the analysis section from the levels of oxyhaemoglobin and deoxyhaemoglobin sensed by the Jazz sensor. Voice data is sensed by the microphone, and the corresponding audio signal is coded and transmitted to the analysis unit. The expression glasses monitor the muscle movement around the eyes, from which the facial expression of the subject user is examined. Saccades are the most abrupt eye movements and directly indicate the user's level of visual attention. Eye movement is monitored by the oculographic transducers, which implement the principle of eye gaze tracking; two axial accelerometers sense the velocity of eye motion along with head acceleration, and optical transducers detect the eye movement itself. To obtain accurately processed data about eye position and movement, some information about the operating room is needed; the ambient illumination sensor serves this purpose by providing data on the light conditions in the room. The Jazz sensor thus senses different physiological attributes which point to the emotional state of the user. The separate attributes have different values and hence must be appropriately multiplexed: the Jazz sensor samples saccadic activity at 1 kHz and the other parameters at 250 Hz, and the samples are transmitted for analysis to carry out emotion detection.

4. BLUE EYES HARDWARE

4.1 System overview

The Blue Eyes system provides technical means for monitoring and recording the operator's physiological parameters. The Blue Eyes hardware consists of two main parts: the Data Acquisition Unit and the Central System Unit. The Data Acquisition Unit is a mobile measuring device consisting of a number of modules such as the physiological parameter sensor, voice interface, and ID card interface. An ID card assigned to each operator, together with an adequate user profile on the central system unit, provides the necessary data personalization, so different people can use a single mobile device.
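The mixed-rate multiplexing just described (saccadic activity at 1 kHz, the remaining parameters at 250 Hz) can be sketched as a simple interleaved sample schedule; the function and parameter names are hypothetical, not the actual Jazz firmware:

```python
def frame_schedule(duration_ms, saccade_hz=1000, other_hz=250):
    """Interleave timestamped samples: saccadic activity at saccade_hz,
    the remaining parameters grouped together at other_hz."""
    frames = []
    step_us = 1_000_000 // saccade_hz      # microseconds between saccade samples
    ratio = saccade_hz // other_hz         # saccade samples per slow sample (4)
    for i in range(duration_ms * saccade_hz // 1000):
        t_us = i * step_us
        frames.append((t_us, "saccade"))
        if i % ratio == 0:                 # every 4th slot, add the slow group
            frames.append((t_us, "pulse_temp_gsr"))
    return frames

# Over 4 ms: four 1 kHz saccade samples and one 250 Hz slow sample.
print(len(frame_schedule(4)))  # 5
```

The interleaved list models how a single serial frame stream can carry attributes sampled at different rates, with timestamps letting the analysis side demultiplex them.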
The mobile device is integrated with a Bluetooth module providing a wireless interface between the sensors worn by the operator and the central unit. The Central System Unit provides real-time buffering of the incoming sensor signals and semi-real-time processing of the data; it consists of different modules for its proper functioning. The individual components are explained below.

4.2 Data Acquisition Unit

The Data Acquisition Unit (DAU) is the mobile part of the Blue Eyes hardware. Its main tasks are to maintain the Bluetooth connection, receive information from the sensor and send it over the wireless connection, deliver the alarm messages sent from the central system unit to the operator, and handle the personalized ID cards. The components which constitute the Data Acquisition Unit are an Atmel 89C52 microcontroller, a PCM codec, a personal ID card interface, a Jazz multisensor interface, a beeper, an LCD display, LED indicators, a simple keyboard, and a Bluetooth module.

[Figure: DAU block diagram — microphone/earphone, PCM codec, Jazz multisensor interface, keyboard, LED indicators, and the Atmel 89C52 microcontroller.]

The DAU of the Blue Eyes hardware is composed of the following components.

a. Personal ID card interface. ID cards are assigned to each operator, and the corresponding user profiles on the central system unit provide the necessary data personalization, so different people can use a single mobile DAU. To start the DAU, the operator must insert his personal ID card. After inserting the ID card into the mobile device and entering the proper PIN code, the device starts listening for incoming Bluetooth connections. Once the connection has been established and the authorization process has succeeded (the PIN code is correct), the central system starts monitoring the output of the DAU.

b. Jazz multisensor. The multisensor senses the different physiological parameters and sends its composite output signal to the DAU. This signal is received, properly coded, and sent to the central system unit over the Bluetooth connection. The Jazz sensor is interfaced directly to the microcontroller.

c. Bluetooth module. The ROK 101008 Bluetooth module is used to establish a wireless connection between the mobile DAU and the stationary central system unit. It provides a wireless connection between two transceivers within a range of about 10 m, i.e. the vicinity of a room.

d. Atmel 89C52 microcontroller. The Atmel 89C52 is the microcontroller forming the core of the DAU. It supports bidirectional serial data transmission. The 89C52 was chosen because it follows well-established industrial standards and provides the necessary functionality; its high-speed serial port makes serial data transmission smooth. All the DAU software is written in 8051 assembler code, using the AS-31 assembler, which ensures high program efficiency and low resource utilization.

e. PCM codec. Since the Bluetooth module supports synchronous voice data transmission, the Blue Eyes hardware uses a PCM codec to transmit the operator's voice and the central system's sound feedback. The codec reduces the microcontroller's workload and lessens the amount of data sent over the UART. The PCM codec performs voice data compression, which results in smaller bandwidth utilization and better sound quality. The codec used is the Motorola MC145483, a linear 13-bit 3.3 V codec which is voltage-level compatible with the Bluetooth module.

f. Beeper, LEDs, LCD, and keyboard. The central system unit detects user-defined alarm conditions from the input given to it by the DAU. The beeper produces an alarm sound when exceptions are detected, informing the user or his colleagues. A simple keyboard is provided to react to incoming events, for example to silence the alarm; it is also used to enter the personal PIN code during the authorization procedure. The alphanumeric LCD display gives more information about the alarm conditions and helps the user enter his PIN accurately. If the user does not insert his ID card during the authorization process, the system undergoes a self-test, and the LED indicators on the DAU show its result; they also show the current power level and the state of the wireless connection. An assembled DAU was developed at Poznan University.
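The start-up and authorization sequence described for the ID card interface above can be sketched as a small state function; the state names and PIN handling are illustrative assumptions, not the actual DAU firmware:

```python
def dau_startup(card_inserted, pin_entered, expected_pin):
    """Decide the DAU's next state from the ID card and PIN inputs:
    no card -> run the self-test (result shown on the LED indicators);
    wrong PIN -> stay waiting for a correct PIN;
    correct PIN -> listen for an incoming Bluetooth connection."""
    if not card_inserted:
        return "self_test"
    if pin_entered != expected_pin:
        return "await_pin"
    return "listening"

print(dau_startup(True, "1234", "1234"))   # listening
print(dau_startup(True, "0000", "1234"))   # await_pin
print(dau_startup(False, None, "1234"))    # self_test
```

Once in the "listening" state, the central system's successful authentication over Bluetooth would move the device on to monitoring, as described in the text.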
The DAU stores the data received from the sensor in an internal buffer; after a whole frame is complete, it encapsulates the data in an ACL frame and sends it over the Bluetooth link. The fetching phase takes approximately 192 µs (8 µs × 24 frames), and the data is then sent at 115200 bps. In the DAU there are two independent data sources: the Jazz sensor and the Bluetooth host controller. Since both are handled using the interrupt system, it is necessary to decide which source should have the higher priority. Giving the sensor data the highest priority may result in losing some of the data sent by the Bluetooth module, as the transmission of a sensor data frame takes twice as long as receiving one byte from the UART, and missing a single byte sent from the Bluetooth causes the loss of control over the transmission. On the other hand, giving the Bluetooth the highest priority makes the DAU stop receiving sensor data until the host controller finishes its transmission; in this case the interrupted sensor frame is discarded. (Data being sent from the DAU to the Bluetooth module is not considered, as it does not affect the operation.) Since missing one byte of Bluetooth communication affects the functioning of the DAU much more than losing a single sensor data frame, the higher priority is given to the Bluetooth link.

4.3 Central System Unit

The Central System Unit (CSU) constitutes the second peer of the wireless connection. The box contains a Bluetooth transceiver and a PCM codec for voice data transmission. The main functions of the CSU are maintaining the Bluetooth connection, buffering incoming sensor data, performing online data analysis, recording the results for further reference, and providing visualization. The main parts of the CSU are the connection manager, the data logger module, the data analysis module, the alarm dispatcher module, and the visualization module. The connection manager forms the front end of the CSU, receiving the incoming data. The data logger module buffers the incoming data and the processed information. The data analysis module processes the sensor output and identifies the emotion. The incoming data is also loaded into the alarm dispatcher module, which checks the alarm conditions and produces suitable output. The visualization module provides a convenient way for the supervisor to access the database.

[Figure: CSU inter-module communication — the Connection Manager feeds separated physiological data streams to per-operator Operator Managers, which supply the Data Analysis, Visualization, and Data Logger modules with processed and recorded (off-line) data.]

The individual units which constitute the CSU are explained below.

a. Connection Manager. The Connection Manager's main task is to perform low-level Bluetooth communication using Host Controller Interface commands. It is designed to cooperate with all available Bluetooth devices. Additionally, the Connection Manager authorizes operators, manages their sessions, and demultiplexes and buffers raw physiological data. It is responsible for establishing and maintaining connections using all available Bluetooth devices: it periodically inquires for new devices in the operating range and checks whether they are registered in the system database, communicating only with registered devices. After a connection is established, an authentication procedure takes place, performed using the system PIN code fetched from the database. Once the connection has been authenticated, the mobile unit sends a data frame containing the operator's identifier. Finally, the Connection Manager adds a SCO link (voice connection) and runs a new dedicated Operator Manager, which will manage the new operator's session.

b. Operator Manager. The data of each supervised operator is buffered separately in a dedicated Operator Manager. At startup it communicates with the Operator Data Manager to get more detailed personal data. The most important task of the Operator Manager is to buffer the incoming raw data and to split it into separate data streams, one per measured parameter. The raw data is sent to the Logger Module, while the split data streams are made available to the other system modules through producer-consumer queues. The Operator Manager also provides an interface for sending alert messages to the related operator. The Operator Data Manager provides an interface to the operator database, enabling the other modules to read or write personal data and system access information.

c. Data Logger Module. The module provides support for storing the monitored data, enabling the supervisor to reconstruct and analyze the course of the operator's duty.
The module registers as a consumer of the data to be stored in the database. Each working operator's data is recorded by a separate instance of the Data Logger. Apart from the raw and processed physiological data, alerts and the operator's voice are stored. The raw data is supplied by the related Operator Manager module, whereas the Data Analysis module delivers the processed data. The voice data is delivered by a Voice Data Acquisition module, which registers as a consumer of the operator's voice data and optionally processes the sound before storing it (e.g. reducing noise or removing fragments in which the operator does not speak). The Logger adds appropriate timestamps to enable the system to reconstruct the voice.

d. Data Analysis Module. The module analyzes the raw sensor data to obtain information about the operator's physiological condition. A separately running Data Analysis Module supervises each working operator. The module consists of a number of smaller analyzers extracting different types of information. Each analyzer registers at the appropriate Operator Manager or at another analyzer as a data consumer and, acting as a producer, provides the results of its analysis.

e. Alarm Dispatcher Module. The Alarm Dispatcher Module is a very important part of the Data Analysis module. It registers for the results of the data analysis, checks them against the user-defined alarm conditions, and launches appropriate actions when needed. The module is a producer of alarm messages, so that they are accessible in the logger and visualization modules.

f. Visualization Module. The module provides the user interface for the supervisors. It enables them to watch each working operator's physiological condition along with a preview of the selected video source and the related sound stream. All incoming alarm messages are instantly signaled to the supervisor. Moreover, the visualization module can be set to an off-line mode, in which all the data is fetched from the database; by watching the recorded physiological parameters, alarms, video, and audio data, the supervisor can reconstruct the course of a selected operator's duty.

5. APPLICATIONS OF BLUE EYES

BLUE EYES enables the computer to adapt a working style that fits the user's personality, increasing his productivity. The technology can also be incorporated into automobiles and interactive entertainment such as toys. An alarm can be built into the steering of a vehicle to warn the driver if his stress level goes beyond a critical level. Blue Eyes may also find its way into everyday routines through appliances such as televisions and washing machines. In a company, the supervisor can be given the power to access a database containing his employees' profiles, linked with the Blue Eyes technology.
Thus he can check the emotional state of any of his workers at any time, and thereby evaluate their performance. For example, the boss can monitor the saccadic activity of a worker on night duty; if the saccadic activity is low, the boss can infer that the worker is sleeping, trigger an alarm, or inform others to wake him up.

6. FUTURE SCOPE

We find it possible to improve the project further. The use of a miniature CMOS camera integrated into the eye movement sensor would enable the system to calculate the point of gaze and observe what the operator is actually looking at. Introducing a voice recognition algorithm would facilitate communication between the operator and the central system and simplify the authorization process.

Future applications of this technology are limitless, from designing cars to controlling household devices.

7. CONCLUSION
BLUE EYES is a modern technology concerned with giving computers emotional intelligence. It can make life simpler: anybody could operate household devices, sophisticated machines, or industrial equipment with complicated operating procedures without much conscious effort. With this technology we can have devices that carry out our tasks when we speak to them, and personal computers that hear us, speak to us, and even scream aloud. By providing more natural means of operating our devices, BLUE EYES will find its way into our day-to-day life and become an integral part of it.

REFERENCES
http://www.wikipedia.org
ROK101008 Bluetooth Module, Ericsson Microelectronics

CONTENTS
1. INTRODUCTION
2. BIOMETRICS
3. MAIN BIOMETRIC TECHNIQUES
3.1 FACIAL EXPRESSION DETECTION
3.2 SPEECH RECOGNITION
3.3 EMOTION MOUSE
3.4 EYE GAZE TRACKING
3.5 JAZZ MULTISENSOR
4. BLUE EYES HARDWARE
4.1 SYSTEM OVERVIEW
4.2 DATA ACQUISITION UNIT
4.3 CENTRAL SYSTEM UNIT
5. APPLICATION
6. FUTURE SCOPE
7. CONCLUSION
8. REFERENCE

Reference: http://seminarprojects.com/Thread-blue-eyes-download-full-report-andabstract?page=2#ixzz1nThMbgON

ABSTRACT
The computers we use in our day-to-day life have tremendous abilities to perform sophisticated tasks easily. They can do the tasks assigned to them with lightning speed. They can understand a

variety of computer languages. They can effectively compile the programs written in these languages and understand what to do. Despite their lightning speed and awesome powers of computation, today's PCs are essentially deaf, dumb and blind. If we want our computers to be genuinely intelligent and to interact naturally with us, we must give them the power to recognize and understand emotions. Imagine ourselves in a world where humans interact with computers as they do with one another. Such a computer has the ability to gather information about us and interact accordingly. It can even understand the emotions of its user at the touch of a mouse. This can be brought into reality with the help of the upcoming technology BLUE EYES. To implement the Blue Eyes technology we must give our personal computers the power to recognize and understand our emotions.

The search for a unique and reliable technique for securing resources has finally led to the realization that human physiological traits are unique enough to be used as identifiers. What was lacking was the technology to exploit the potential of these characteristics. The new technology, BIOMETRICS, emerged as a solution. Thus BLUE EYES, using biometric sensors, can have an incredible effect in the field of emotion detection, which can itself act as a security measure.

1. INTRODUCTION
The BLUE EYES project was started at IBM's Almaden Research Centre in the USA. It aims at giving computers highly developed abilities to perceive, integrate and interpret visual, auditory and touch information. The project explores various ways of allowing people to operate computers without conscious effort. Gaze tracking seemed like the natural place to start, because the eyes are the most expressive part of a human being and all emotions are reflected in them. These days it is humans who have to adapt to computers by learning various languages, modes of operation and so on.
The BLUE EYES project aims at creating computers which can adapt to humans and thus be operated much more conveniently, enabling computers and humans to work together more as partners. The project makes effective use of existing biometric techniques for emotion detection. Different biometric sensors are used to monitor different parts of the human body, and by properly processing their output the emotion of a person is identified. Blue Eyes uses non-obtrusive sensing technologies, such as video cameras and microphones, to identify the user's actions and to extract key information. These clues are analyzed to determine the user's physical, emotional or informational state.

2. BIOMETRICS
Computers must be given the power to sense the emotions of their users in order to become 'attentive computers'. BIOMETRICS is the science used for this implementation: the science of measuring the physiological and behavioral characteristics of a person. Using these characteristics, the emotional state of the user is identified. The eyes are the most expressive part of a human being, so iris scanning is the most important technique used. The emotions of a human being are directly reflected in his physiological attributes, so measures of heartbeat, blood pressure and the like give direct information about the emotional state of the user. Several techniques, such as eye gaze tracking, facial expression detection, speech recognition, detection using the emotion mouse and the Jazz multisensor, are used for emotion detection.

Besides emotion detection, biometric techniques can be used for security purposes. Physical characteristics such as fingerprints, hand geometry, the retina and the voice are unique for every human being. By analyzing these characteristics a person can be easily

identified. Several methods, such as fingerprint identification, retinal scanning and voice identification, have been successfully implemented for security. In the implementation of Blue Eyes, however, the biometric techniques are used mainly for emotion detection.

3. MAIN BIOMETRIC TECHNIQUES
3.1 Facial Expression Detection
Facial expression detection consists of two steps: face recognition and expression detection.

Face Recognition
Face recognition is applied in a variety of domains, predominantly security. The user's face must be identified before further processing. The first problem to be solved is finding the face in the image. The first stage of the process is color segmentation, which simply determines whether the proportion of skin-tone pixels in a region is greater than some threshold. Candidate regions are then given scores. Next, instead of searching for all the facial features directly in the face image, a few 'high level' features (eyes, nose, mouth) are located first. Then 26 other 'low level' features, which may be parts of the eyes, nose, mouth, eyebrows and so on, are located relative to the high-level feature locations. The approximate locations of the high-level features are known from statistics of mean and variance relative to the nose position, gathered on the training database.

Expression Detection
Facial expression determination is a field of rapid ongoing research. The most intriguing invention is the Expression Glasses, a wearable mobile device that is comfortable to wear; surveys among its users have given good results. The device measures the movement of the facial muscles, which is then compared with a reference index to determine emotion. It is used to determine the user's level of interest or confusion by measuring the movement of the muscles around the eyes.
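The colour-segmentation stage described above can be sketched as follows; the RGB skin-tone test and the 40% threshold are common heuristics assumed here, not values from the report.

```python
# Sketch of colour segmentation for face finding: a region is a face
# candidate when the proportion of skin-tone pixels exceeds a threshold.
# Both the RGB skin test and the 0.4 threshold are assumed heuristics.

def is_skin(r, g, b):
    """Crude RGB skin-tone test (a widely used heuristic)."""
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def is_face_candidate(pixels, threshold=0.4):
    """pixels: iterable of (r, g, b) tuples for one candidate region."""
    pixels = list(pixels)
    skin = sum(1 for p in pixels if is_skin(*p))
    return skin / len(pixels) > threshold

region = [(180, 120, 90)] * 7 + [(30, 30, 30)] * 3   # 70% skin-tone pixels
```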
The output from the expression glasses is fed to the computer for further processing, where it is converted into a two-colored bar graph in which red bars indicate confusion and green bars indicate interest. This graph gives a clear indication of the level of interest or confusion.

3.2 Speech Recognition
Speech recognition is the process of converting a speech signal to a set of words by means of an algorithm implemented as a computer program. Voice or speaker identification is a related process that attempts to identify the person speaking, as opposed to what is being said. Speech is processed by means of complex voice processing algorithms. First the speech signal is converted, by proper sampling, quantization and coding, into a representation called a voice print. A reference index contains voice prints corresponding to each emotion; by comparing the subject user's voice print with these references, the emotion is identified. It is mainly the tone of the voice that is compared, rather than what is being said.

3.3 Emotion Mouse
A non-invasive way to obtain information about the user's emotional state is through touch. People use their computer to store and manipulate data, and the proposed method for obtaining user information through touch is via a computer input device, the mouse. The computer determines the user's emotional state from a simple touch: sensors in the mouse measure physiological attributes, which are correlated to emotions using a correlation model. The emotion mouse consists of a number of sensors, each sensing an individual attribute.
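The correlation model just described can be sketched as a nearest-centroid lookup over the sensed attribute vector. All sensor values and centroids below are illustrative assumptions, not measurements from the report.

```python
# Sketch of the emotion mouse correlation model: heart rate, body
# temperature and skin conductivity form a vector that is matched to the
# nearest emotion centroid. Every number here is an illustrative guess.
import math

CORRELATION_MODEL = {
    #           (heart rate bpm, temperature C, conductance uS)
    "relaxed":  (65.0, 33.0, 2.0),
    "stressed": (95.0, 35.5, 8.0),
}

def detect_emotion(heart_rate, temperature, conductance):
    """Return the emotion whose centroid is closest to the sensed vector."""
    sample = (heart_rate, temperature, conductance)
    return min(CORRELATION_MODEL,
               key=lambda e: math.dist(sample, CORRELATION_MODEL[e]))
```

A production model would of course be trained per user; the nearest-centroid rule is just the simplest instance of "compare the vector with the correlation model".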

The sensors incorporated in the emotion mouse are an IR sensor, a thermistor chip and a galvanic skin sensor. The IR sensor measures the heartbeat from the fingertip, the thermistor chip measures the body temperature, and the galvanic sensor measures skin conductivity. These attributes are combined into a vector representing the emotional state of the user. This vector is compared with the emotion-to-attribute correlation model, and the present emotion of the user is determined.

3.4 Eye Gaze Tracking
The eye gaze tracking system monitors the eye movements of the user to detect his emotional state. It uses a technique called the pupil finder to monitor the subject user's eye movement.

3.5 Jazz Multisensor
The Jazz multisensor senses multiple attributes. It is a mobile device that the user wears on his forehead, incorporating several sensors and employing all the emotion detection techniques described above; this single device is capable of detecting a subject user's emotional state. It was developed at the research laboratory of Poznan University, Poland. The sensors in a Jazz sensor include an IR sensor, an oculographic transducer, an environmental illumination sensor, expression glasses and a microphone. The different sensors sense different physiological attributes, producing a complex output. The plethysmographic signals, i.e. the signals from the cardiac, circulatory and pulmonary systems, give a direct indication of the emotional state; the sensors sensing these signals are collectively known as plethysmographic transducers. The IR sensor senses the heartbeat and the level of blood oxygenation. The heartbeat pulse rate is calculated in the analysis section from the levels of oxyhaemoglobin and de-oxyhaemoglobin sensed by the Jazz sensor.
The voice data is sensed by the microphone; the corresponding audio signal is properly coded and transmitted to the analysis unit. The expression glasses monitor the muscle movement around the eyes, from which the facial expression of the subject user is examined. Saccades are the most abrupt eye movements, and they directly indicate the user's level of visual attention. Eye movement is monitored by the oculographic transducers, which implement the principle of eye gaze tracking. Two axial accelerometers sense the velocity of eye motion along with head acceleration. The sensors provided for detecting eye movement are optical transducers, so to obtain accurately processed data about eye position and movement we need some information about the operating room. The environmental illumination sensor serves this purpose, providing data about the light conditions in the room. The Jazz sensor thus senses different physiological attributes which point to the emotional state of the user. The separate attributes have different values and hence have to be appropriately multiplexed: the Jazz sensor samples saccadic activity at 1 kHz and the other parameters at 250 Hz. The samples are then transmitted for suitable analysis to perform emotion detection.

4. BLUE EYES HARDWARE
4.1 System Overview
The Blue Eyes system provides technical means for monitoring and recording the operator's

physiological parameters. The Blue Eyes hardware consists of two main parts: the Data Acquisition Unit and the Central System Unit. The Data Acquisition Unit is a mobile measuring device consisting of a number of modules, such as the physiological parameter sensor, the voice interface and the ID card interface. An ID card assigned to each operator, together with an adequate user profile on the central system unit, provides the necessary data personalization, so different people can use a single mobile device. The mobile device is integrated with a Bluetooth module providing a wireless interface between the sensors worn by the operator and the central unit. The Central System Unit provides real-time buffering of incoming sensor signals and semi-real-time processing of the data. It consists of different data modules for proper functioning. The overall system diagram is shown below; the individual components are explained in what follows.

[Figure: overall system diagram]

4.2 Data Acquisition Unit
The Data Acquisition Unit (DAU) is the mobile part of the Blue Eyes hardware. Its main tasks are to maintain Bluetooth connections, to receive information from the sensor and send it over the wireless connection, to deliver the alarm messages sent from the central system unit to the operator, and to handle the personalized ID cards. The components which constitute the DAU are an Atmel 89C52 microcontroller, a PCM codec, a personal ID card interface, a Jazz multisensor interface, a beeper, an LCD display, LED indicators, a simple keyboard and a Bluetooth module.

[Figure: DAU block diagram - microphone/earphone, PCM codec, Jazz multisensor and ID card interfaces, keyboard, LEDs and LCD around the Atmel 89C52 microcontroller]
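The multi-rate sampling noted in section 3.5 (saccadic activity at 1 kHz, the remaining parameters at 250 Hz) implies that each slow sample is interleaved with four fast ones before transmission. A sketch, with an assumed frame layout:

```python
# Sketch of multi-rate multiplexing: four 1 kHz saccade samples are
# grouped with one 250 Hz slow-parameter sample per 4 ms frame.
# The frame layout itself is an assumption for illustration.

def multiplex(saccades_1khz, slow_250hz):
    """Group fast samples with slow samples, 4:1, into frames."""
    frames = []
    for i, slow in enumerate(slow_250hz):
        frames.append({"saccades": saccades_1khz[4 * i:4 * i + 4],
                       "slow": slow})
    return frames
```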

The DAU of the Blue Eyes hardware is composed of the different components explained below.

a. Personal ID card interface
ID cards are assigned to each of the operators, and corresponding user profiles on the central system unit provide the necessary data personalization, so different people can use a single mobile DAU. To start the proper functioning of the DAU, the operator should insert his personal ID card. After inserting the ID card into the mobile device and entering

the proper PIN code, the device starts listening for incoming Bluetooth connections. Once the connection has been established and the authorization process has succeeded (the PIN code is correct), the central system starts monitoring the output of the DAU.

b. Jazz multisensor
The multisensor senses different physiological parameters and sends the complex output signal to the DAU. This signal is received, properly coded and sent to the central system unit over the Bluetooth connection. The Jazz sensor is properly interfaced to the microcontroller.

c. Bluetooth module
The ROK 101008 Bluetooth module is used to establish a wireless connection between a mobile DAU and the stationary central system unit. It provides a wireless connection between two transceivers in a range of about 10 m, i.e. the vicinity of a room.

d. Atmel 89C52 microcontroller
The Atmel 89C52 microcontroller forms the core of the DAU and handles bidirectional serial data transmission. The 89C52 was chosen because it is a well-established industrial standard and provides the necessary functionality. It has a very high-speed serial port which makes serial data transmission smooth. All the DAU software is written in 8051 assembler code, assembled with AS-31; this ensures the highest program efficiency and the lowest resource utilization.

e. PCM codec
Since the Bluetooth module supports synchronous voice data transmission, the Blue Eyes hardware uses a PCM codec to transmit the operator's voice and the central system's sound feedback. The codec reduces the microcontroller's load and lessens the amount of data sent over the UART, performing voice data compression that results in smaller bandwidth utilization and better sound quality. The codec used is the Motorola MC145483, a linear 13-bit 3.3 V codec which is voltage-level compatible with the Bluetooth module.

f.
Beeper, LED, LCD and keyboard
The central system unit detects user-defined alarm conditions from the input given to it by the DAU. The beeper produces an alarm sound when exceptions are detected, informing the user or his colleagues. A simple keyboard is provided to react to incoming events, e.g. to silence the alarm; it is also used to enter the personal PIN code during the authorization procedure. The alphanumeric LCD display gives more information about the alarm conditions and helps the user enter his PIN accurately. If no ID card has been inserted, the system performs a self-test, and the LED indicators on the DAU show its result; they also show the current power level and the state of the wireless connection. An assembled DAU, developed at Poznan University, is shown below.

The DAU stores the data received from the sensor in an internal buffer; after the whole frame is completed, it encapsulates the data in an ACL frame and sends it over the Bluetooth link. The fetching phase takes approximately 192 us (8 us x 24 frames), and the sending phase is carried out at 115200 bps. In the DAU there are two independent data sources: the Jazz sensor and the Bluetooth host controller. Since both are handled using the interrupt system, it is necessary to decide which source should have the higher priority. Giving the sensor data the highest priority may

result in losing some of the data sent by the Bluetooth module, as the transmission of the sensor data takes twice as much time as receiving one byte from the UART. Missing a single byte sent from the Bluetooth module causes the loss of control over the transmission. On the other hand, giving the Bluetooth module the highest priority makes the DAU stop receiving sensor data until the host controller finishes its transmission, in which case the interrupted sensor frame is discarded. (Data sent from the DAU to the Bluetooth module is not considered, as it does not affect the operation.) Since missing one byte of Bluetooth communication affects the functioning of the DAU much more than losing a single sensor data frame, the higher priority is given to the Bluetooth link.

4.3 Central System Unit
The central system unit (CSU) constitutes the second peer of the wireless connection. The box contains the Bluetooth transceiver and the PCM codec for voice data transmission. The main functions of the CSU include maintaining the Bluetooth connection, buffering incoming sensor data, performing online data analysis, recording the results for further reference, and providing visualization. The main parts of the CSU are the connection manager, the data logger module, the data analysis module, the alarm dispatcher module and the visualization module. The connection manager forms the front end of the CSU, receiving the incoming data. The data logger module buffers the incoming data and the processed information. The data analysis module processes the output of the sensor and identifies the emotion. The incoming data is loaded into the alarm dispatcher module together with the data logger; this module checks the alarm conditions and produces suitable output. The visualization module provides a convenient way for the supervisor to access the database.

[Figure: CSU inter-module communication - the Connection Manager feeds per-operator Operator Managers (OM), whose separated physiological data streams go to the Data Analysis, Visualization and Data Logger modules, exchanging processed and recorded (off-line) data]
The individual units which constitute the CSU are explained below.

a. Connection Manager
The Connection Manager's main task is to perform low-level Bluetooth communication using Host Controller Interface commands. It is designed to cooperate with all available Bluetooth devices. Additionally, the Connection Manager authorizes operators, manages their sessions, and demultiplexes and buffers raw physiological data.
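The Connection Manager's admission sequence (inquiry, database check, PIN authentication, session start with a voice link) can be sketched as follows; the in-memory device table and PIN values are illustrative assumptions.

```python
# Sketch of the Connection Manager's admission flow: only devices
# registered in the system database are contacted, the system PIN is
# checked, and a session with a voice (SCO) link is started on success.
# The device addresses and PINs here are illustrative assumptions.

REGISTERED_DEVICES = {"00:11:22:33:44:55": "1234"}   # address -> system PIN

def admit(device_address, offered_pin):
    """Return a session dict on success, or None if the device is rejected."""
    system_pin = REGISTERED_DEVICES.get(device_address)
    if system_pin is None:          # not registered: never contacted
        return None
    if offered_pin != system_pin:   # authentication failure
        return None
    return {"device": device_address, "sco_link": True}
```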

The Bluetooth Connection Manager is responsible for establishing and maintaining connections using all available Bluetooth devices. It periodically inquires for new devices in the operating range and checks whether they are registered in the system database; it communicates only with registered devices. After a connection is established, an authentication procedure takes place using the system PIN code fetched from the database. Once the connection has been authenticated, the mobile unit sends a data frame containing the operator's identifier. Finally, the Connection Manager adds a SCO link (voice connection) and runs a new dedicated Operator Manager, which will manage the new operator's session.

b. Operator Manager
The data of each supervised operator is buffered separately in a dedicated Operator Manager. At startup it communicates with the Operator Data Manager to get more detailed personal data. The most important task of the Operator Manager is to buffer the incoming raw data and to split it into separate data streams, one per measured parameter. The raw data is sent to a Logger Module, and the split data streams are made available to the other system modules through producer-consumer queues. Furthermore, the Operator Manager provides an interface for sending alert messages to the related operator. The Operator Data Manager provides an interface to the operator database, enabling the other modules to read or write personal data and system access information.

c. Data Logger Module
The module provides support for storing the monitored data so that the supervisor can reconstruct and analyze the course of the operator's duty. The module registers as a consumer of the data to be stored in the database; each working operator's data is recorded by a separate instance of the Data Logger. Apart from the raw or processed physiological data, alerts and the operator's voice are stored.
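The Operator Manager's splitting of raw frames into producer-consumer streams, and the Data Logger's time-stamping of the consumed values, can be sketched as follows; the three-parameter frame layout is an assumption for illustration.

```python
# Sketch: the Operator Manager (producer) splits one raw frame into
# per-parameter queues; the Data Logger (consumer) reads a value and
# attaches a time stamp. The frame layout is an assumed example.
from queue import Queue

FRAME_LAYOUT = ("heart_rate", "temperature", "saccades")
streams = {name: Queue() for name in FRAME_LAYOUT}

def split_frame(raw_frame):                     # Operator Manager side
    """Demultiplex one raw frame into the per-parameter queues."""
    for name, value in zip(FRAME_LAYOUT, raw_frame):
        streams[name].put(value)

def log_next(name, timestamp):                  # Data Logger side
    """Consume the next value of one parameter and stamp it."""
    return {"parameter": name,
            "value": streams[name].get(),
            "timestamp": timestamp}
```

The queues decouple the producer from its consumers, which is exactly why the text can run the analysis, logging and visualization modules independently.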
The raw data is supplied by the related Operator Manager module, whereas the Data Analysis module delivers the processed data. The voice data is delivered by a Voice Data Acquisition module, which registers as a consumer of the operator's voice data and optionally processes the sound before storage (e.g. reduces noise or removes fragments in which the operator does not speak). The Logger's task is to add appropriate time stamps so that the system can reconstruct the voice.

d. Data Analysis Module
The module analyzes the raw sensor data to obtain information about the operator's physiological condition. A separately running Data Analysis Module supervises each of the working operators. The module consists of a number of smaller analyzers extracting different types of information. Each analyzer registers at the appropriate Operator Manager or at another analyzer as a data consumer and, acting as a producer, provides the results of its analysis.

e. Alarm Dispatcher Module
The Alarm Dispatcher Module is an important part of the Data Analysis module. It registers for the results of the data analysis, checks them against the user-defined alarm conditions, and launches appropriate actions when needed. The module is a producer of the alarm messages, so that they are accessible in the logger and visualization modules.

f. Visualization Module
The module provides the user interface for the supervisors. It enables them to watch each working operator's physiological condition along with a preview of the selected video source and the related sound stream. All incoming alarm messages are instantly signaled to the supervisor. Moreover, the visualization module can be set to an off-line mode, in which all the data is fetched from the database. By examining the recorded physiological parameters,

alarms, video and audio data, the supervisor is able to reconstruct the course of the selected operator's duty.

5. APPLICATIONS OF BLUE EYES
BLUE EYES enables the computer to adopt a working style that fits the user's personality, increasing his productivity. The technology can also be incorporated into automobiles and into interactive entertainment such as toys. An alarm can be built into the steering wheel of a vehicle to warn the driver if his stress level goes beyond a critical threshold. Blue Eyes may also find its way into everyday appliances such as televisions and washing machines. In a company setting, the supervisor can be given access to a database of employee profiles linked with the Blue Eyes system. He can thus check the emotional state of any of his workers at any time and thereby evaluate their performance. For example, he can monitor the saccadic activity of a worker on night duty; if the saccadic activity is low, he can infer that the worker is dozing, trigger an alarm, or ask others to wake him up.

6. FUTURE SCOPE
There is still room to improve the system. A miniature CMOS camera integrated into the eye movement sensor would enable the system to calculate the point of gaze and observe what the operator is actually looking at. Introducing a voice recognition algorithm would facilitate communication between the operator and the central system and simplify the authorization process. Future applications of this technology are limitless, from designing cars to controlling household devices.

7. CONCLUSION
BLUE EYES is a modern technology concerned with giving computers emotional intelligence.
It can make life simpler: anybody could operate household devices, sophisticated machines, or industrial equipment with complicated operating procedures without much conscious effort. With this technology we can have devices that carry out our tasks when we speak to them, and personal computers that hear us, speak to us, and even scream aloud. By providing more natural means of operating our devices, BLUE EYES will find its way into our day-to-day life and become an integral part of it.

