An Introduction to Interactive Sonification


The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human–computer interfaces, and computer music. Sonification presents information by using sound (particularly nonspeech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening.[1]

We define interactive sonification as the use of sound within a tightly closed human–computer interface where the auditory signal provides information about data under analysis, or about the interaction itself, which is useful for refining the activity.

Here we review the evolution of auditory displays and sonification in the context of computer science, history, and human interaction with physical objects. We also extrapolate the trends of the field into future developments of real-time, multimodal interactive systems.

Multimodal data analysis

As computers become increasingly prevalent in society, more data sets are being collected and stored digitally, and we need to process these in an intelligent way. Data processing applications range from analyzing gigabytes of medical data to ranking insurance customers, from analyzing credit card transactions to the problem of monitoring complex systems such as city traffic or network processes. For the newer applications, the data often have a high dimensionality. This has led to two trends:

❚ the development of techniques to achieve dimensionality reduction without losing the available information in the data, and

❚ the search for techniques to represent more dimensions at the same time.

Regarding the latter point, auditory displays offer an interesting complement to visual displays.
For example, an acoustic event (the audio counterpart of the graphical symbol) can show variation in a multitude of attributes such as pitch, modulations, amplitude envelope over time, spatial location, timbre, and brightness simultaneously.

Human perception, though, is tuned to process a combined audiovisual (and often also tactile and olfactory) experience that changes instantaneously as we perform actions. Thus we can increase the dimensionality further by using different modalities for data representation. The more we understand the interaction of these different modalities in the context of human activity in the real world, the more we learn what conditions are best for using them to present and interact with high-dimensional data.

Interacting with musical interfaces

Throughout history humankind has developed tools that help us shape and understand the world. We use these in a close action-perception loop, where physical interaction yields continuous visual, tactile, and sonic feedback. Musical instruments are particularly good examples of systems where the acoustic feedback plays an important role in coordinating the user's activities.

The development of electronic musical instruments can shed light on the design process for human–machine interfaces. Producing an electronic instrument requires designing both the interface and its relationship to the sound source. This input-to-output mapping is a key attribute in determining the success of the interaction. In fact, Hunt, Paradis, and Wanderley[2] have shown that the form of this mapping determines whether the users consider their machine to be an instrument.
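The idea of letting each data dimension drive a different acoustic attribute is commonly called parameter-mapping sonification. The sketch below illustrates it under stated assumptions: the record layout, the pitch and amplitude ranges, and all function names are illustrative choices of ours, not something specified in this article.

```python
import math

def map_record(record, pitch_range=(220.0, 880.0), amp_range=(0.2, 1.0)):
    """Map one normalized data record (three values in [0, 1]) to acoustic
    attributes: dimension 1 drives pitch, dimension 2 amplitude, dimension 3
    stereo pan. The ranges are illustrative, not prescribed by the article."""
    x, y, z = record
    # Exponential pitch scale, so equal data steps give equal musical intervals.
    freq = pitch_range[0] * (pitch_range[1] / pitch_range[0]) ** x
    amp = amp_range[0] + (amp_range[1] - amp_range[0]) * y
    pan = z  # 0.0 = hard left, 1.0 = hard right
    return freq, amp, pan

def render_tone(freq, amp, dur=0.2, rate=44100):
    """Synthesize one acoustic event: a sine tone with a linear decay envelope."""
    n = int(dur * rate)
    return [amp * (1 - i / n) * math.sin(2 * math.pi * freq * i / rate)
            for i in range(n)]

# A record at the midpoint of every dimension lands at 440 Hz (one octave
# above the lower pitch bound) with amplitude 0.6.
freq, amp, pan = map_record((0.5, 0.5, 0.5))
tone = render_tone(freq, amp)
```

Playing one such event per data record turns a table of numbers into a stream of acoustic symbols, each carrying several data dimensions at once.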
Furthermore, it can allow (or not) the user to experience the flow[3] of continuous and complex interaction, where the conscious mind is free to concentrate on higher goals and feelings rather than the stream of low-level control actions needed to operate the machine.

[Guest Editors' Introduction by Thomas Hermann, Bielefeld University, Germany, and Andy Hunt, University of York, UK. © 2005 IEEE, published by the IEEE Computer Society.]

Acoustic instruments require a continuous energy input to drive the sound source. This necessity for physical actions from the human player has two important side effects: it helps to continuously engage the player in the feedback loop, and it causes continuous modulation of all the available sound parameters because of the complex cross-couplings that occur in physical instruments.

We can speculate whether this theory can be extrapolated to the operation of all computer systems. Maybe because these are so often driven by choice-based inputs (menus, icons, and so on) that rely on language or symbolic processing rather than physical interaction, we have a world of computers that often fails to engage users in the same way as musical instruments.

Another important aspect to consider is naturalness. In any interaction with the physical world, the resulting sound fed back to the user is natural in the sense that it reflects a coherent image of the temporal evolution of the physical system. The harder a piano key is hit, the louder the note (and its timbre also changes in a known way). Such relations are consistent with everyday experience, which means that people everywhere will inherently understand the reaction of a system that behaves in this way.

We argue that an interactive sonification system is a special kind of virtual musical instrument. It's unusual in that its acoustic properties and behavior depend on the data under investigation.
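The piano example can be read as a monotonic, cross-coupled mapping: a single physical input (key velocity) drives several acoustic parameters at once, and all of them move in the direction everyday experience predicts. A toy sketch of such a mapping, where the curve shapes and constants are our own illustrative assumptions rather than measured piano behavior:

```python
def strike(velocity):
    """Map a normalized key velocity (0..1) to coupled acoustic parameters,
    mimicking a physical instrument: harder hits are both louder and brighter.
    The exponent and range below are illustrative, not measured piano data."""
    amplitude = velocity ** 1.5          # loudness grows monotonically with force
    brightness = 0.3 + 0.7 * velocity    # spectral brightness rises with force
    return {"amplitude": amplitude, "brightness": brightness}

soft = strike(0.2)
hard = strike(0.9)
```

Because every output parameter responds monotonically and simultaneously to the input, the system's reaction stays consistent with the user's physical intuition, which is the naturalness property the text describes.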
Also, it's played primarily to learn more about the data, rather than for musical expression. Yet it's one that will benefit from the knowledge and interaction currency that humans have built up over thousands of years of developing and performing with musical instruments.

Interactive sonification techniques

Conceptually, the simplest auditory display is the auditory event marker, a sound that's played to signal something (akin to a telephone ring). Researchers have developed the techniques of auditory icons and earcons for this purpose,[1] yet they're rarely used to display larger or complete data sets. Auditory icons and earcons are frequently used as direct feedback to an activity, such as for touching a number on an ATM keypad or the sound widgets in computer interfaces. The feedback usually isn't continuous but consists of discrete events.

Another common sonification technique is audification, where a data series is converted to samples of a sound signal. Many of the resulting sounds are played back without interruption, rather like listening to a CD track, and there's no interaction with the sound. We can, however, turn audification
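Audification as described above can be sketched in a few lines: each data point becomes one audio sample, scaled to the 16-bit range and written out at some playback rate. The function and parameter names here are our own, and the 8 kHz rate and sweep data are illustrative assumptions.

```python
import math
import struct
import wave

def audify(series, rate=8000, path="audification.wav"):
    """Audify a data series: treat each data point as one audio sample.
    The series is centered and scaled to fill the 16-bit range, so the
    playback duration is len(series) / rate seconds."""
    lo, hi = min(series), max(series)
    mid = (hi + lo) / 2.0
    half = (hi - lo) / 2.0 or 1.0  # avoid division by zero for constant data
    samples = [int(32767 * (v - mid) / half) for v in series]
    with wave.open(path, "wb") as w:
        w.setnchannels(1)       # mono
        w.setsampwidth(2)       # 16-bit samples
        w.setframerate(rate)    # data points per second of playback
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))
    return samples

# Example: 440 points of a quadratic-phase sweep audified at 8 kHz
# play back in 440 / 8000 = 0.055 seconds.
data = [math.sin(0.05 * i * i / 440.0) for i in range(440)]
samples = audify(data)
```

The compression of the whole series into a short burst of sound is exactly what makes audification useful for spotting periodicities and discontinuities by ear, and what makes it non-interactive when simply played back end to end.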

