UB CSE 666 - Facial Expression Biometrics Using Tracker Displacement Features

Facial Expression Biometrics Using Tracker Displacement Features

Sergey Tulyakov¹, Thomas Slowe², Zhi Zhang¹, and Venu Govindaraju¹
¹Center for Unified Biometrics and Sensors, University at Buffalo, NY, USA
{tulyakov,zhizhang,venu}@cedar.buffalo.edu
²CUBRC, Buffalo, NY, USA
[email protected]

Abstract

In this paper we investigate the possibility of using facial expression information for person biometrics. The idea behind this research is that a person's emotional facial expressions are repeatable, so facial expression features can be used for person identification. In order to avoid using the person-specific geometric or textural features traditionally employed in face biometrics, we restrict ourselves to tracker displacement features only. In contrast to previous research in facial expression biometrics, we extract features from only a pair of face images, one neutral and one at the apex of the emotion expression, instead of using a sequence of images from video. Experiments performed on two facial expression databases confirm that the proposed features can indeed be used for biometric purposes.

1. Introduction

The way a person behaves, and, in particular, the way a person expresses emotions, can serve as an indicator of that person's identity. Ekman [3] previously identified 18 different types of smiles. The type of smile a person expresses is partly influenced by the person's current emotional state and environment. At the same time, we can speculate that the person's cultural background and stable psychological traits determine how frequently each type appears and how strongly it is expressed. If so, then the type of smile can be used for biometric person identification.

Some psychological studies seem to confirm individual differences in emotions, and in smiles in particular. Krumhuber et al. [7] provide an extensive overview of recent psychological research on differences in smiles and in the perception of smiles. As discussed there, the perceived strength of a smile can be influenced by the observer's general beliefs or expectations. These results imply that psychological studies involving human test subjects might not give an objective picture of smile or expression individuality. On the other hand, it has been noted that women smile more frequently than men [1], and such frequency can be measured rather objectively.

In our research we are interested in automatically separating the expressions of different individuals. Besides demonstrating the individuality of facial expressions, our goal is an algorithm that identifies persons based on their expressions. Since expressions might not provide enough discriminating power, our approach may be considered a soft biometric [5].

Previous research into facial expression biometrics is very limited. Schmidt and Cohn [9] and Cohn et al. [2] describe a system performing biometric person authentication using changes in action unit appearances. The system measures the intervals between different phases of facial action units [4] and their sequence order; the action units and time intervals are measured with the help of special hardware. The reported performance is comparable with that of a commercial face biometrics system. Another interesting study is presented in [8], where the authors investigate whether emotion expression is hereditary. Experiments conducted with congenitally blind subjects seem to indicate that the facial emotion expressions of family members are correlated. The features used included sequences of 43 types of facial movements.
In both of the above approaches, long sequences of facial movements were needed to produce a statistically significant person identification algorithm. In our research we investigate whether information obtained from single images can serve the same purpose. Like these works, we concentrate on expressions associated with particular emotions. We assume that the same emotion of the same person is expressed as a similar set of action units of similar intensity. Thus, a single image of the person at the height of emotion expression might be sufficient for person discrimination.

Figure 1. Tracker points and extracted reference frame used for normalization.

2. Tracker Displacement Features

The main difficulty in our research was producing features that are specific to facial expressions and do not contain information traditionally used for face biometrics. For example, we did not want to use geometric distances between different points on the face, or texture information, either of which might carry strong information about the person's identity. Such information might be irrelevant to facial expressions, and our results would be greatly skewed if such features were used.

We therefore restricted ourselves to displacement distances between tracker points. The tracker positions used are described in [10] and illustrated in Figure 1. Though the tracker positions can be extracted automatically [10], we found that the algorithm does not work reliably on the single-frame images or very short video sequences we used. Thus, we relied on a human expert to locate the tracker points.

To find displacements we use two images of a person: one with a neutral expression and one at the apex of emotion. The distances between corresponding tracker points are indicative of the movement of the face, but not of the face's geometric form.

In order to account for the global transformation of the face in the image frame, we estimate the frame of reference shown in Figure 1. The vertical axis and the direction of the horizontal axis were found by regression on the sums of corresponding left and right tracker points. The location of the horizontal axis was found by averaging the vertical positions of some stable points, namely the tracker points of the eyes and nose. The positions of the tracker points were also scaled to account for different distances from the face to the camera.

After these normalization procedures were applied to both the neutral and emotion images, the displacements between corresponding tracker positions were recorded. The final result is 116 features corresponding to the x and y coordinates of 58 tracker points. These features appear to contain no geometric face information, only information about the emotion expression. Though it is possible that some face geometries favor specific movements of tracker points, we regard this as very improbable. Two images, neutral and emotion, are thus used to construct a single feature vector for a person.
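As a concrete illustration, the following sketch shows how such displacement features could be computed with NumPy. This is a minimal sketch under our own assumptions, not the authors' implementation: the tracker points of each image are assumed to be given as a (58, 2) array of (x, y) coordinates; the index lists for the left/right point pairs and for the stable eye and nose points are hypothetical, since the paper does not enumerate them; the midline is fitted through the midpoints of the left/right pairs (equivalent, up to a constant factor, to regression on their sums); and the scale factor is our own choice, as the paper does not specify one.

    import numpy as np

    def displacement_features(neutral, apex, left, right, stable):
        """Compute a 116-dim displacement feature vector from two (58, 2)
        arrays of tracker points (neutral and apex-of-emotion images).

        `left`/`right` index corresponding left/right tracker pairs, and
        `stable` indexes the eye and nose points; all three index lists
        are hypothetical, since the paper does not enumerate them.
        """
        def to_face_frame(pts):
            # Midpoints of corresponding left/right points lie on the face
            # midline (using midpoints instead of the paper's sums only
            # rescales the regression inputs by 1/2).
            mid = 0.5 * (pts[left] + pts[right])
            # Least-squares fit x = a*y + b gives the vertical axis.
            a, b = np.polyfit(mid[:, 1], mid[:, 0], 1)
            # The horizontal axis passes through the mean height of the
            # stable (eye and nose) points.
            y0 = pts[stable, 1].mean()
            origin = np.array([a * y0 + b, y0])
            # Rotate so that the fitted midline becomes the y axis.
            t = np.arctan(a)
            R = np.array([[np.cos(t), -np.sin(t)],
                          [np.sin(t),  np.cos(t)]])
            local = (pts - origin) @ R.T
            # Assumed scale normalization (face-to-camera distance):
            # divide by the mean distance of the stable points from the
            # origin of the face frame.
            return local / np.linalg.norm(local[stable], axis=1).mean()

        # Per-point (dx, dy) displacements, flattened: 58 * 2 = 116 features.
        return (to_face_frame(apex) - to_face_frame(neutral)).ravel()

The resulting 116-dimensional vectors, one per neutral/apex image pair, can then be compared with any standard distance measure for identification experiments.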

