Recognizing, Modeling, and Responding to Users' Affective States

Helmut Prendinger (1), Junichiro Mori (2), and Mitsuru Ishizuka (2)

(1) National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda-ku, Tokyo 101-8430, Japan
    [email protected]
(2) Dept. of Information and Communication Engineering, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan
    [email protected]

Abstract. We describe a system that recognizes physiological data of users in real-time, interprets this information as affective states, and responds to affect by employing an animated agent. The agent assumes the role of an Empathic Companion in a virtual job interview scenario where it accompanies a human interviewee. While previously obtained results with the companion were not significant, the analysis reported here demonstrates that empathic feedback of an agent may reduce user arousal while hearing interviewer questions. This outcome may prove useful for educational systems or applications that induce user stress.

1 Introduction

Computers sensing users' physiological activity are becoming increasingly popular in the human–computer interface and user modeling communities, partly because of the availability of affordable high-specification sensing technologies, and also due to the recent progress in interpreting physiological states as affective states or emotions [10]. The general vision is that if a user's emotion could be recognized by the computer, human–computer interaction would become more natural, enjoyable, and productive. The computer could offer help and assistance to a confused user or try to cheer up a frustrated user, and hence react in ways that are more appropriate than simply ignoring the user's affective state, as is the case with most current interfaces.

Our particular interest concerns interfaces that employ animated or embodied agents as interaction partners of the user. By emulating multi-modal human–human communication and displaying social cues including (synthetic) speech, communicative gestures, and the expression of emotion, those agents may trigger social reactions in users, and thus implement the "computers as social actors" metaphor [14]. This type of social and affect-aware interface has been demonstrated to enrich human–computer interaction in a wide variety of applications, including interactive presentations, training, and sales [2, 12].

In this paper, we propose an interface that obtains information about a user's physiological activity in real-time and provides affective feedback by means of an embodied agent. The interface is intended to respond to the user's emotion by showing concern about user affect, sometimes called empathic (or sympathetic) behavior. Empathic interfaces may leave users less frustrated in the case of a stressful event related to the interaction [5]. Potential application fields include software (assuming unavoidable software-related failures), computer-based customer support, and educational systems. The web-based (virtual) job interview scenario described here serves as a simple demonstrator application that allows us to discuss the technical issues involved in real-time emotion recognition as well as the implementation of an empathic agent. In this paper, we will extend and complement our previous investigations on empathic agents.

– Virtual Quizmaster. An agent providing empathic feedback to a deliberately frustrated user can significantly reduce user arousal or stress when compared to an agent that ignores the user's frustration [13].
– Empathic Companion. An empathic agent has no overall positive effect on the user's interaction experience in terms of lower levels of arousal [11].

The rest of this paper is organized as follows. In Sect. 2, we describe related work. Section 3 is dedicated to introducing the Empathic Companion. There, we first describe our system for real-time emotion recognition, and then explain how physiological signals are mapped to named emotions. The final part of Sect. 3 discusses the decision-theoretic agent that is responsible for selecting the Empathic Companion's actions. In Sect. 4, we illustrate the structure of an interaction with the Empathic Companion in the setting of a virtual job interview, and provide new results of an experiment that recorded users' physiological activity during the interaction. Section 5 concludes the paper.

2 Related Work

There are various research strands that share the methodology and motivation of our approach to affective and empathic interfaces. The tutoring system developed by Conati [3] demonstrates that the user's physiological state can play a key role in selecting strategies to adapt an educational interface. When the user's frustration is detected, an interface agent can try to undo the user's negative feeling. Bickmore [1] investigates empathic agents in the role of health behavior change assistants that are designed to develop and maintain long-term, social-emotional relationships with users, so-called 'relational agents'.

The investigation of Klein et al. [5] is most closely related to our work on empathic interfaces. They describe the design and evaluation of an interface implementing strategies aimed at reducing negative affect, such as active listening, empathy, sympathy, and venting. The resulting affect-support agent used in a simulated network game scenario could be shown to undo some of the users' negative feelings after they have been deliberately frustrated by simulated network delays inserted into the course of the game. The Empathic Companion interface differs from the one used in [5] in two aspects. First, the user in our system is given feedback in a more timely fashion, i.e. shortly after the emotion actually occurs, and not after the interaction session, in response to the subject's questionnaire entries. While providing immediate response to user affect is certainly preferable in terms of natural interaction, it assumes that affect is processed in real-time. Hence, in order to assess a user's emotional state online, we implemented a system that takes physiological signals of the user during the interaction with the computer.
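The details of the paper's own recognition pipeline appear in Sect. 3, which is not part of this preview. As an illustration of the kind of online interpretation described above, the following Python sketch maps two physiological readings onto an arousal/valence representation and then onto a coarse emotion label. The choice of skin conductance as the arousal channel, an EMG-like signal as the valence channel, the baseline comparison, and all thresholds are assumptions made for this sketch, not details taken from the text.

```python
# Minimal sketch of real-time affect interpretation from physiological signals.
# Assumptions (not from the paper): skin conductance drives arousal, a second
# signal (e.g., facial EMG) drives valence, and readings are compared against a
# relaxed baseline. Sensor choices and thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class Baseline:
    skin_conductance: float  # e.g., microsiemens measured while the user is relaxed
    valence_signal: float    # e.g., mean EMG amplitude at rest

def to_arousal_valence(sc: float, val: float, base: Baseline, eps: float = 0.1):
    """Convert raw readings into signed arousal/valence relative to the baseline."""
    arousal = (sc - base.skin_conductance) / max(base.skin_conductance, 1e-6)
    # Assume higher muscle activity indicates more negative valence (a frown-like response).
    valence = -(val - base.valence_signal) / max(base.valence_signal, 1e-6)
    # Treat small deviations from the baseline as neutral.
    arousal = 0.0 if abs(arousal) < eps else arousal
    valence = 0.0 if abs(valence) < eps else valence
    return arousal, valence

def name_emotion(arousal: float, valence: float) -> str:
    """Label a quadrant of the arousal/valence plane with a coarse emotion name."""
    if arousal <= 0.0:
        return "relaxed" if valence >= 0.0 else "bored"
    return "joyful" if valence >= 0.0 else "frustrated"

# Example: a reading taken while the interviewer asks a question.
base = Baseline(skin_conductance=2.0, valence_signal=1.0)
arousal, valence = to_arousal_valence(sc=2.8, val=1.5, base=base)
print(name_emotion(arousal, valence))  # high arousal, negative valence -> "frustrated"
```

In this toy version, the baseline stands in for whatever calibration the real system performs before the interview; an actual implementation would also smooth the signal stream over time rather than classify single samples.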
[Fig. 1. Job Interview Scenario.]

Second, affective feedback to the user is communicated by means of an embodied agent, rather than a text message. Although the study of Klein and co-workers [5] supports the argument that embodiment is not necessary to achieve social response, it has been shown that embodied characters may boost the tendency of

