An Interactive Virtual Reality Platform for Studying Embodied Social Interaction

Hui Zhang ([email protected])
Department of Computer Science, Indiana University

Chen Yu ([email protected])
Department of Psychology and Brain Sciences, Indiana University

Linda B. Smith ([email protected])
Department of Psychology and Brain Sciences, Indiana University

Abstract

We present an interactive virtual reality platform for studying the role of embodied social interaction in the context of language learning. The virtual environment consists of virtual objects, a virtual table, and, most importantly, a set of virtual students with different social-cognitive skills. Real users are asked to serve as language teachers and teach the virtual learners object names. They can interact with the virtual learners by gazing at, pointing at, and moving virtual objects, as well as through speech acts. Since both the virtual environment (what users see) and the virtual humans (whom users interact with) are controlled (pre-programmed), this provides a unique opportunity to study how real teachers perceive different social signals generated by virtual learners and how they adjust their behaviors accordingly. One primary result is that real people feel comfortable interacting with virtual humans in the virtual environment and treat them as social partners. Moreover, the platform allows us to record real people's multimodal behavioral data and to analyze the data across individual participants to extract shared behavioral patterns. Overall, this work demonstrates the usefulness of virtual reality technologies for studying both human-human and human-machine social interactions.

Introduction

A better understanding of human-human interaction in language learning has long been a subject of fascination. Language learning is a social event between teachers and learners. Nonverbal communication, including body language, gaze, gesture, and facial expression, is crucial for both smooth communication and effective learning. More specifically, body language signaled by a language teacher provides useful cues for a language learner to infer what the speaker intends to refer to in the (as yet) unknown language. For example, a deictic pointing action singles out one object from the many in a natural scene and indicates the speaker's referential intentions [14]. Meanwhile, body language signaled by a language learner indicates his/her attentional state, so that the language teacher can adjust her behaviors accordingly to enhance interaction and learning. For instance, if the language teacher realizes that the learner is not engaged in the interaction, she may generate actions to attract the learner's attention. On the other hand, if the learner is fully engaged, the teacher can focus on using body language to facilitate language learning rather than on engaging the learner.

Although previous research demonstrates the importance of social cues in the laboratory environment [1], quantitative analyses of the role of social cues in the real world are very difficult without interfering with the interaction itself. What is really needed is an approach to controlling the dynamic interactions between the language teacher and the language learner. By doing so, we can decouple the social interactions between the two agents and manipulate the parameters of the interaction dynamically and systematically in a well-controlled way. The present paper addresses this challenge by using state-of-the-art technologies in computer graphics and virtual reality.

In the past decade, applications of virtual reality (VR) technology have developed rapidly with advances in computer graphics software and hardware. VR techniques provide a unique way to let people interact efficiently with 3D computerized characters in a computer-rendered environment, in real time, using their natural senses and skills. Recently, there has been a growing recognition that VR can play an important role in basic research across a variety of disciplines, including cognition [2], education [9, 4], and perception [11]. Among others, Jasso and Triesch [6] presented a virtual reality platform for developing and evaluating embodied models of cognitive development. Turk et al. [13] introduced a paradigm for studying multimodal and nonverbal communication in collaborative virtual environments, where a user's communication behaviors can be filtered and re-rendered in a VR environment to change the nature of the social interaction.

In light of this, we present a new experimental paradigm that exploits VR technologies to decouple complex social interactions between two agents and to study the role of embodied social cues in language learning. Specifically, we hypothesize that naturalistic social influence can occur within immersive virtual environments as a function of two additive factors: behavioral realism and social presence. This paper takes the first steps toward this goal by designing and implementing a novel interactive virtual reality platform and asking real users to interact with virtual humans through various embodied social interactions. We report a case study using this platform, with evaluations, in the context of a language learning task.

System Framework

Overview

We build virtual humans equipped (pre-programmed) with different kinds of social-cognitive skills and ask real people to interact with them in a virtual environment.

[Figure 1: Overview of system architecture. A real human and a virtual human interact within the virtual environment (static and dynamic objects) through perception and action channels (point, move, gaze, speak, expressions); the virtual human's behavioral repertoire includes gaze, pointing, facial expressions, lip motion, and walking.]

Our VR interaction system consists of four components, as shown in Figure 1 and sketched in code after the list:

• A virtual environment includes a virtual laboratory with furniture and a set of virtual objects that real people can manipulate in real time via a touch screen mounted on a computer screen.
• Virtual humans can demonstrate different kinds of social skills and perform actions in the virtual environment.
• Multimodal interaction between virtual humans and real people includes speaking, eye contact, pointing at, gazing at, and moving virtual objects.
• Data recording monitors and records a participant's body movements, including pointing and moving actions on virtual objects, eye gaze, and speech acts, in real time.
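To make the division of labor among these four components concrete, the following is a minimal, hypothetical Python sketch of how a pre-programmed learner profile, a teacher's embodied actions, and real-time event logging might fit together. It is not the platform's actual implementation, which runs as a real-time 3D graphics application; the names (Action, InteractionEvent, VirtualHuman, Session), the attentiveness parameter, and the learner's simple response rule are illustrative assumptions.

# Hypothetical sketch of the four-component interaction loop described above.
# All class names and the response rule are illustrative assumptions, not the
# platform's real implementation.

from dataclasses import dataclass, field
from enum import Enum, auto
from time import time
from typing import List, Optional


class Action(Enum):
    """Embodied actions available to either partner (cf. Figure 1)."""
    GAZE = auto()
    POINT = auto()
    MOVE_OBJECT = auto()
    SPEAK = auto()
    FACIAL_EXPRESSION = auto()


@dataclass
class InteractionEvent:
    """One time-stamped multimodal event, as captured by the data-recording component."""
    timestamp: float
    agent: str                             # "real_human" or a virtual learner's id
    action: Action
    target_object: Optional[str] = None    # the virtual object gazed/pointed at, if any
    utterance: Optional[str] = None        # speech content, if any


@dataclass
class VirtualHuman:
    """A pre-programmed virtual learner with a fixed social-cognitive profile."""
    name: str
    attentiveness: float                   # 0.0 (disengaged) .. 1.0 (fully engaged)

    def respond_to(self, event: InteractionEvent) -> InteractionEvent:
        # A disengaged learner looks away; an engaged learner follows the
        # teacher's point or gaze to the referenced object.
        if self.attentiveness < 0.5:
            return InteractionEvent(time(), self.name, Action.GAZE, target_object=None)
        return InteractionEvent(time(), self.name, Action.GAZE,
                                target_object=event.target_object)


@dataclass
class Session:
    """Ties the components together and logs every event in real time."""
    learner: VirtualHuman
    objects: List[str]                                 # virtual objects on the virtual table
    log: List[InteractionEvent] = field(default_factory=list)

    def teacher_acts(self, action: Action, obj: Optional[str] = None,
                     utterance: Optional[str] = None) -> None:
        event = InteractionEvent(time(), "real_human", action, obj, utterance)
        self.log.append(event)                          # data recording
        self.log.append(self.learner.respond_to(event)) # virtual human's reaction


# Example: a teacher points at a virtual object and names it.
session = Session(learner=VirtualHuman("attentive_learner", attentiveness=0.9),
                  objects=["toy_car", "cup", "ball"])
session.teacher_acts(Action.POINT, obj="toy_car")
session.teacher_acts(Action.SPEAK, obj="toy_car", utterance="This is a gazzer.")
for e in session.log:
    print(e.agent, e.action.name, e.target_object, e.utterance)

The usage example at the end mirrors the task described above: the real user points at a virtual object and names it, and every embodied action from both partners is time-stamped and appended to a log that can later be compared across participants.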
Building Virtual Humans

Appearance and Behavior. One of the most important issues in our design is the "behavioral realism" of the virtual agents, which means that virtual

