
Abstract— Facial expressions and emotions play an important role in communication and social interaction with other human beings, delivering rich information about a person's mood. "BLUE EYES TECHNOLOGY" aims at creating computational machines that have sensory and perceptual abilities like those of human beings, enabling the computer to gather information about humans and interact with them. This paper implements the detection of emotions (happy, sad, fear, surprised, anger, disgust) by taking into consideration human eye expressions and by using an emotion mouse. The emotion mouse obtains physiological data and the emotional state of a person through a single touch of a mouse equipped with different sensors. Emotions are also determined from human eye expressions by analyzing the eye region of a video sequence. From the different frames of the video stream, the human eyes can be extracted using an edge operator and then classified using a Support Vector Machine (SVM) classifier. After classification, a standard learning tool, the Hidden Markov Model (HMM), is used for recognizing the emotions from the human eye expressions. After successful detection of an emotion, a suitable audio track is played.

Keywords— Blue eyes, emotion mouse, emotion recognition, eye expressions, Support Vector Machine (SVM), Hidden Markov Model (HMM).

I. INTRODUCTION

"BLUE EYES" technology aims at creating computational machines by adding extraordinary perceptual abilities to computers that help them verify a person's identity, feel their presence, and interact with them. Human recognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Blue eyes technology makes a computer sense and understand human feelings and behavior, and enables it to respond according to the sensed emotional level. The main aim of blue eyes technology is to give human abilities or power to a computer, so that the machine can interact with human beings as naturally as humans interact with each other.

The methodologies proposed in this paper to detect human emotions are the emotion mouse and emotion recognition from human eye expressions. The emotion mouse is an input device designed to track the emotions of a user through a simple touch. It is used to evaluate and identify the user's emotions, such as happy, sad, anger, fear, disgust, and surprised, while the user is interacting with the computer. Recognition of human emotions is an important component of efficient man-machine interaction. It plays a critical role in communication by allowing people to express themselves beyond the verbal domain. Analysis of emotions from human eye expressions involves the detection and categorization of various human emotions or states of mind. For example, in security and surveillance, an offender's or criminal's behavior can be predicted by analyzing images of their face from the frames of a video sequence. The analysis of human emotions can be applied in a variety of application domains, such as video surveillance and human-computer interaction systems.

In some cases, the results of such analysis can be applied to identify and categorize the various human emotions automatically from the videos.

II. RELATED WORK

Many approaches to blue eyes technology and human emotion recognition have been proposed in the last two decades. Mizna Rehman Mizna et al. [1] implement a technique known as the Emotion Sensory World of Blue Eyes Technology, which identifies human emotions (sad, happy, excited or surprised) using image processing techniques: the eye portion is extracted from a captured image and compared with stored images in a database. The paper reports two key results of the emotional sensory world. First, observation reveals that different eye colors and their intensities result in changes in the detected emotion, without giving any information on the shape of the eye or the actual detected emotion. Second, the system successfully recognizes four different emotions of the eyes. S. R. Vinotha et al. [2] use a feature extraction technique to extract the eyes, a support vector machine (SVM) classifier, and an HMM to build a human emotion recognition system.

Their proposed system analyzes the human eye region from video sequences. From the frames of the video stream, the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear Support Vector Machine (SVM) classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used for recognizing the emotions from the human eye expressions. Mohammad Soleymani et al. [3] present an approach to instantaneously detecting the emotions of video viewers from electroencephalogram (EEG) signals and facial expressions. A set of emotion-inducing videos was shown to participants while their facial expressions and physiological responses were recorded.

The expressed valence (negative to positive emotions) in the videos of the participants' faces was annotated by five annotators. The stimulus videos were also continuously annotated on valence and arousal dimensions. Long short-term memory recurrent neural networks (LSTM-RNN) and Continuous Conditional Random Fields (CCRF) were used to detect emotions automatically and continuously. The results from facial expressions were found to be superior to the results from EEG signals. The authors analyzed the effect of the contamination of facial muscle activities on EEG signals and found that most of the emotionally valuable content in the EEG features is a result of this contamination. However, their statistical analysis showed that EEG signals carry complementary information in the presence of facial expressions.

T. Moriyama et al. [4] propose a system capable of detailed analysis of eye region images in terms of the position of the iris, the degree of eyelid opening, and the shape, complexity, and texture of the eyelids.

The system uses a generative eye region model that parameterizes the fine structure and motion of an eye. The structure parameters represent the structural individuality of the eye, including the size and color of the iris; the width, boldness, and complexity of the eyelids; the width of the bulge below the eye; and the width of the illumination reflection on the bulge. The motion parameters represent movement of the eye, including the up-down position of the upper and lower eyelids and the 2D position of the iris. Renu Nagpal et al. [5] present the first publicly available dataset of labeled data, recorded over the Internet, of people naturally viewing online media. The AM-FED dataset contains: 1) 242 webcam videos recorded in real-world conditions; 2) 168,359 frames labeled for the presence of 10 symmetrical FACS action units, 4 asymmetric (unilateral) FACS action units, 2 head movements, smile, general expressiveness, feature tracker failures, and gender; 3) locations of 22 automatically detected landmark points; 4) baseline performance of detection algorithms on this dataset and baseline classifier outputs for smile; and 5) self-report responses of familiarity with, liking of, and desire to watch again the stimulus videos.

This represents a rich and extensively coded resource for researchers working in the domains of facial expression recognition, affective computing, psychology, and marketing. The videos in this dataset were recorded in real-world conditions; in particular, they exhibit non-uniform frame rates and non-uniform lighting. The camera position relative to the viewer varies from video to video, and in some cases the screen of the laptop is the only source of illumination.

The videos contain viewers from a range of ages and ethnicities, some with glasses and facial hair. The dataset contains a large number of frames with agreed presence of facial action units and other labels.

III. METHODOLOGY USED

A. Emotion Recognition from Human Eyes

Facial expressions play an essential role in communication and social interactions with other human beings, delivering information about their emotions. The most crucial feature of human interaction, and the one that grants naturalism to the process, is our ability to infer the emotional states of others. Our goal is to categorize the different human emotions from their eye expressions.

The proposed system presents a human emotion recognition system that analyzes the human eye region from video sequences. From the frames of the video stream, the human eyes are extracted using the well-known Canny edge operator and classified using a non-linear Support Vector Machine (SVM) classifier. Finally, a standard learning tool, the Hidden Markov Model (HMM), is used for recognizing the emotions from the human eye expressions.

Fig. 1: Sample eye expressions (surprised, sad, happy, anger, fear, disgust)
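As an illustration of the HMM stage, the following minimal sketch (not the paper's implementation) decodes a sequence of per-frame SVM labels into the most likely underlying emotion sequence using a hand-written Viterbi pass. The transition and emission probabilities are illustrative assumptions; in practice they would be estimated from labeled eye-expression sequences.

```python
# Viterbi decoding over per-frame SVM outputs (observation symbols 0..5),
# with the six emotions as hidden states. All probabilities are assumed.
import numpy as np

N = 6  # surprised, sad, happy, anger, fear, disgust
start = np.full(N, 1.0 / N)                       # uniform prior over emotions
trans = np.full((N, N), 0.04) + np.eye(N) * 0.76  # emotions tend to persist
emit = np.full((N, N), 0.05) + np.eye(N) * 0.70   # the SVM label is usually right

def viterbi(obs):
    """Most likely emotion sequence given a list of per-frame SVM labels."""
    logd = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(trans)    # previous state x next state
        back.append(scores.argmax(axis=0))
        logd = scores.max(axis=0) + np.log(emit[:, o])
    path = [int(logd.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

# Example: an isolated misclassified frame is smoothed out by the model.
print(viterbi([2, 2, 0, 2]))  # -> [2, 2, 2, 2]
```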

Human emotion recognition is an important component of efficient human-computer interaction. It plays a critical role in communication, allowing people to express themselves beyond the verbal domain. Analysis of emotions from human eye expressions involves the detection and categorization of various human emotions and states of mind. The analysis of human emotions can be applied in a variety of application domains, such as video surveillance and human-computer interaction systems.

In some cases, the results of such analysis can be applied to identify and categorize the various human emotions automatically from the videos. The six primary types of emotions, shown in Fig. 1, are: surprised, sad, happy, anger, fear, and disgust. Our method uses a feature extraction technique to extract the eyes, a support vector machine (SVM) classifier, and an HMM to build a human emotion recognition system. The methodology of emotion recognition from human eye expressions is shown in Fig. 2.

In this methodology, an image of the user sitting in front of the camera is captured. The image, representing a set of frames, is preprocessed and a noise-free image is obtained. The noise-free image is edge-detected using the Canny edge operator. Using the feature extraction process, the eye regions are extracted from the resulting edge-detected image. The extracted eye regions are classified using the SVM classifier. Finally, the corresponding emotions are recognized.
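The sketch below is a minimal illustration of these steps in Python, assuming OpenCV and scikit-learn. The eye location, Canny thresholds, and feature size are assumptions made for the example (the paper does not specify them), and the labeled training data (X_train, y_train) is assumed to be available.

```python
# Eye-expression pipeline: preprocess -> Canny edges -> eye extraction -> SVM.
import cv2
from sklearn.svm import SVC

EMOTIONS = ["happy", "sad", "fear", "surprised", "anger", "disgust"]

def extract_eye_features(frame, eye_box=(140, 90, 80, 40)):
    """Denoise one video frame, run the Canny edge operator, crop the eye region."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)   # preprocessing: noise removal
    edges = cv2.Canny(denoised, 100, 200)          # edge-detected image
    x, y, w, h = eye_box                           # assumed eye location in the frame
    eye = cv2.resize(edges[y:y + h, x:x + w], (32, 16))
    return eye.flatten() / 255.0                   # fixed-length feature vector

# Non-linear SVM classifier (RBF kernel assumed; the paper says only "non-linear").
clf = SVC(kernel="rbf")
# clf.fit(X_train, y_train)  # y_train holds indices into EMOTIONS
# Per-frame prediction for a BGR webcam frame:
# emotion = EMOTIONS[int(clf.predict([extract_eye_features(frame)])[0])]
```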

B. Emotion Mouse

One proposed, non-invasive method for gaining user information through touch is via a computer input device, the mouse. This allows the system to relate cardiac rhythm, body temperature, and other physiological attributes to the user's mood.

Fig. 3: Block Diagram of Emotion Mouse

The block diagram of the emotion mouse is shown in Fig. 3. This device measures heart rate and temperature and matches them with six emotional states: happiness, surprise, anger, fear, sadness, and disgust. The mouse includes a set of sensors, including infrared detectors and temperature-sensitive chips. These components could also be built into other commonly used items such as the office chair, the steering wheel, the keyboard, and the phone handle. Integrating the system into the steering wheel, for instance, could allow an alert to be sounded when a driver becomes drowsy.

Heart rate is taken by an infrared (IR) sensor on the thumb, and temperature is taken using a thermistor chip. These values are fed into a series of discriminant function analyses and correlated to an emotional state. Specifically, for the mouse, discriminant function analysis is used to determine a baseline relationship, that is, the relationship between each set of calibration physiological signals and the associated emotion.
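The paper names discriminant function analysis for this calibration; as a simpler stand-in, the sketch below classifies a (heart rate, temperature) reading by its scaled distance to per-emotion baseline values. All numbers are illustrative placeholders, not calibration data from the paper.

```python
# Nearest-baseline matching of emotion-mouse readings (a simplified stand-in
# for the discriminant function analysis described above).
import numpy as np

EMOTIONS = ["happiness", "surprise", "anger", "fear", "sadness", "disgust"]

# Hypothetical baselines: (heart rate in bpm, skin temperature in deg C).
BASELINES = np.array([[72, 33.5], [88, 33.9], [95, 34.4],
                      [92, 32.8], [64, 32.5], [70, 33.0]])
SCALE = np.array([30.0, 2.0])  # rough spread of each signal, to weight them fairly

def classify(heart_rate_bpm, temperature_c):
    """Pick the emotion whose baseline is closest to the current reading."""
    reading = np.array([heart_rate_bpm, temperature_c])
    dist = np.linalg.norm((BASELINES - reading) / SCALE, axis=1)
    return EMOTIONS[int(dist.argmin())]

print(classify(90, 34.2))  # -> "surprise" with these placeholder baselines
```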

IV. SYSTEM MODEL

In this system, two methodologies, namely the emotion mouse and emotion recognition from eye expressions, are used. The emotion mouse considers physiological and biological parameters such as cardiac rhythm and body temperature, while emotion recognition from human eye expressions considers facial expression for the detection of human emotion and mood.

Fig. 4: Block diagram of the system

Fig. 4 shows the block diagram of the system. In this system, the data from the heartbeat sensor and temperature sensor of the emotion mouse is given to a microcontroller, whose output is then fed to the computer. The values of the heartbeat sensor and temperature sensor are compared with the standard range for each emotion, and the matching emotion is selected. In parallel, a webcam connected to the computer takes images of the person from a video sequence and recognizes the emotion by detecting the eye region. The captured eye section is compared with the images stored in the database to detect the person's mood. After the mood is detected, music or an audio command is played according to the detected mood.
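A minimal sketch of this comparison step, assuming the microcontroller streams comma-separated "heart_rate,temperature" lines over a serial link (read here with the pyserial package). The per-emotion ranges, port name, and baud rate are all assumptions for the example.

```python
# Match sensor readings from the emotion mouse against standard ranges.
import serial  # pyserial

# Hypothetical standard ranges: emotion -> ((hr_min, hr_max), (temp_min, temp_max)).
RANGES = {
    "happiness": ((65, 80), (33.0, 34.0)),
    "surprise":  ((80, 92), (33.5, 34.2)),
    "anger":     ((90, 110), (34.0, 35.0)),
    "fear":      ((85, 105), (32.0, 33.2)),
    "sadness":   ((55, 68), (32.0, 33.0)),
    "disgust":   ((65, 78), (32.8, 33.5)),
}

def match_emotion(hr, temp):
    """Return the first emotion whose standard ranges contain both readings."""
    for emotion, ((hr_lo, hr_hi), (t_lo, t_hi)) in RANGES.items():
        if hr_lo <= hr <= hr_hi and t_lo <= temp <= t_hi:
            return emotion
    return None  # no match; fall back to the eye-expression result

with serial.Serial("/dev/ttyUSB0", 9600) as port:  # assumed port and baud rate
    hr, temp = map(float, port.readline().decode().split(","))
    print(match_emotion(hr, temp))
```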

V. RESULT

In the proposed system, there are two results from the above methodologies. First, different eye expressions of different people are considered through edge detection of the eyes. Each eye expression is then categorized into a given set of emotions (happy, sad, fear, surprised, disgust, anger) so that a single standard expression represents each emotion. Thus, the emotion of a person can be detected by comparing the person's eye expression with the standard eye expression for each emotion.

Second, the values of the heartbeat sensor and temperature sensor are compared with the standard value range for each emotion, and the emotion whose value range matches the user's data values is taken as the emotional state of the user. According to the detected emotion, music or an audio command is played.
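As an illustration of this final step, the sketch below maps each detected emotion to an audio file and plays it. The playsound package is one possible choice, and the track file names are placeholders.

```python
# Play an audio track matched to the detected emotion.
from playsound import playsound

TRACKS = {
    "happy": "tracks/upbeat.mp3",
    "sad": "tracks/soothing.mp3",
    "fear": "tracks/calming.mp3",
    "surprised": "tracks/neutral.mp3",
    "anger": "tracks/relaxing.mp3",
    "disgust": "tracks/pleasant.mp3",
}

def respond(emotion):
    """Play the audio track associated with the detected emotion (blocking)."""
    playsound(TRACKS[emotion])

respond("happy")  # plays tracks/upbeat.mp3
```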

VI. CONCLUSION

Recent research shows that the understanding and recognition of emotional expressions plays a very important role in the maintenance and development of social relationships. This paper gives an approach to creating computational machines that have perceptual and sensory abilities like those of human beings, enabling the computer to gather information about a person through techniques such as facial expression recognition and through biological factors such as cardiac rhythm and body temperature. This makes it possible for computers and machines to detect human emotion and respond to it.
