Which way am I facing: Inferring horizontal device orientation from an accelerometer signal

Kai Kunze, Paul Lukowicz
Embedded Systems Lab, University of Passau, Passau, Germany
[email protected]

Kurt Partridge, Bo Begole
Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, CA 94304
kurt/[email protected]

Abstract

We present a method to infer the orientation of a mobile device carried in a pocket from the acceleration signal acquired while the user is walking. Whereas previous work has shown how to determine the orientation in the vertical plane (the angle towards earth gravity), we demonstrate how to compute the orientation within the horizontal plane. To validate our method we compare its output with GPS heading information when walking in a straight line. Over a total of 16 different orientations and traces we obtain a mean difference of 5 degrees with a standard deviation of 2.5 degrees.

1. Introduction

The work described in this paper is part of a larger effort of our group to facilitate the use of standard sensor-enabled devices, such as mobile phones, for context recognition. A major issue that has to be considered is the fact that users carry such devices in a variety of ways. In general, neither the location (pocket, belt, bag) nor the orientation of the device can be assumed to be known. In previous work we have demonstrated how to detect where on the body the device is located by analyzing the accelerometer signal during various activities [3]. We have also shown how to deal with sensor displacement [4]. In this paper we show how to infer the orientation of the device with respect to the user's body. We extend existing work [5], which has shown how the orientation in the vertical plane (the angle towards gravity) can be computed, by additionally inferring the orientation in the horizontal plane. Our method is based on the observation that, while walking, most of the variation of the acceleration signal in the horizontal plane is parallel to the direction of motion. To distinguish between front and back we look at the integral of the signal over time.

We focus on the trouser pocket, as it is by far the most likely placement for people's mobile phones (see [2]). This work is partially inspired by [1].

Figure 1. Accelerometer coordinate system in relation to the gravity vector (vertical component) and the walking direction inferred using PCA.

2. Approach

We base the approach on three assumptions:

• The user walks facing forward.
• The device is placed in a trouser pocket. Our approach should also yield similar results for the torso and similar on-body placements; however, we did not test these in this paper.
• We apply our approach to a walking segment in which the user walked fairly straight.

As already described by Mizell [5], one can estimate the gravity component of a 3-axis accelerometer signal. We use a slight variation of this method to obtain the acceleration axis parallel to the person's torso: we apply a sliding window over all 3 axes. If the variance of all axes is close to 0 and the magnitude approaches 9.81 m/s², the signal is very likely to be dominated by the vertical orientation component. Using this heuristic, we infer the vertical component. We then project the accelerometer signal onto the plane perpendicular to the vertical gravity vector (the horizontal plane). We apply principal component analysis to the projected data points (see Figure 1) to find the direction in which the acceleration variation is greatest. This is the axis parallel to the walking direction.
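The paper gives no code, but the pipeline just described is compact enough to sketch. The Python/NumPy fragment below is an illustrative reconstruction, not the authors' implementation: the window length and the variance/magnitude thresholds (win, var_thresh, mag_tol) are assumed values, and the input is assumed to be an (N, 3) array of accelerations in m/s² in sensor coordinates.

```python
import numpy as np

G = 9.81  # gravity magnitude in m/s^2

def estimate_gravity(acc, win=50, var_thresh=0.05, mag_tol=0.5):
    """Estimate the gravity direction (unit vector, sensor coordinates).

    Windows in which all three axes have near-zero variance and whose
    mean magnitude is close to 9.81 m/s^2 are taken to be dominated by
    the vertical (gravity) component; their means are averaged.
    """
    candidates = []
    for start in range(len(acc) - win):
        w = acc[start:start + win]
        if np.all(w.var(axis=0) < var_thresh):
            m = w.mean(axis=0)
            if abs(np.linalg.norm(m) - G) < mag_tol:
                candidates.append(m)
    if not candidates:
        raise ValueError("no quasi-static windows found")
    g = np.mean(candidates, axis=0)
    return g / np.linalg.norm(g)

def walking_axis(acc, g_unit):
    """First principal component of the horizontally projected signal,
    i.e. the axis with the largest acceleration variation, assumed to be
    parallel to the walking direction."""
    # remove the vertical (gravity-aligned) component of every sample
    horiz = acc - np.outer(acc @ g_unit, g_unit)
    horiz -= horiz.mean(axis=0)
    # PCA via eigendecomposition of the 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(np.cov(horiz, rowvar=False))
    return eigvecs[:, np.argmax(eigvals)]  # unit vector, sign is arbitrary
```

Note that the principal component is only defined up to sign, which is why the front/back test described next is needed.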
Assuming that the user is walking forward, integration over this component allows us to determine which way is front (it leads to a positive integral) and which is back (see Figure 3).

Figure 2. The MTx motion sensor in the phone casing placed in the pocket, depicted with the different axis orientations.

3. Experimental Setup

We test the approach described above with the following setup. We use an MTx motion sensor (equipped with a 3-axis accelerometer, a gyroscope and a magnetic field sensor) with a custom Bluetooth sender, placed in a mobile phone casing, as the data source for our algorithm. As reference we use a GPS device. We stream all data to a Nokia N810 running the Context Recognition Network Toolbox (http://crnt.sf.net) for recording and labeling.

Note that in the MTx sensor the magnetic field sensor axes are oriented parallel to the acceleration sensor axes. This means that if our algorithm can infer the orientation of the acceleration axes with respect to the user's body, we automatically have the orientation of the magnetic field sensor with respect to the body. This is equivalent to knowing which way (in terms of longitude and latitude) the user is facing. If the user is walking forward, this direction is also the user's heading. By comparing the heading computed this way with the heading provided by GPS, we can verify the accuracy of our method for determining the horizontal orientation of the sensor.

Two test subjects walked a straight path outside (around 30 meters) with the MTx sensor in the right trouser pocket and the GPS device in hand. We repeated this setup 8 times per person, changing the sensor orientation each time. We use only 8 orientations because, as with a mobile phone, users will most likely not place the device with the thin side facing towards the body. We therefore place the phone casing with the sensor in the right trouser pocket facing front and rotate it by 90 degrees for each following trial, then repeat the same procedure with the back side facing front, leaving us with 8 distinct orientations. Three of the 8 orientations are shown in Figure 2.

4. Results

Using the approach presented above, we can reliably detect the side of the sensor facing in the walking direction for all 16 orientations and traces.

Figure 3. The accumulated (integrated) velocity along the first PCA component direction for two trials with different sensor orientations, one in which the test subject was walking slowly (blue) and one fast (green).

Figure 4. The errors between the accelerometer-based and GPS-based approaches, with a mean of 5 degrees and a standard deviation of 2.5 degrees.
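The front/back decision described in Section 2 reduces to a sign test on the accumulated velocity along the inferred axis (the quantity plotted in Figure 3). A minimal continuation of the NumPy sketch above; dt is an assumed sampling interval, not a value from the paper.

```python
def facing_sign(acc, axis, dt):
    """Return +1 if `axis` points towards the user's front, -1 otherwise.

    The acceleration along the candidate axis is integrated over the
    (fairly straight) walking segment; following the paper's heuristic,
    a positive accumulated velocity means the axis points forward.
    """
    along = acc @ axis                  # acceleration component along the axis
    velocity = np.cumsum(along) * dt    # accumulated (integrated) velocity
    return 1 if velocity[-1] > 0 else -1

# Orient the PCA axis so that it points towards the front of the user:
# axis = walking_axis(acc, g_unit)
# forward = facing_sign(acc, axis, dt=0.01) * axis
```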


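For the GPS comparison in Section 3, the forward axis found in sensor coordinates can be converted into a compass heading via the magnetometer (whose axes are parallel to the accelerometer axes on the MTx) and compared with the GPS course over ground. The preview does not spell out the authors' exact evaluation procedure; the sketch below uses a standard tilt-compensated compass construction and assumes g_unit points towards the ground (flip it if the sensor convention is the opposite).

```python
def heading_error_deg(forward, g_unit, mag, gps_course_deg):
    """Signed difference (degrees, in [-180, 180)) between the compass
    heading of the inferred forward axis and the GPS course over ground.

    `forward` and `g_unit` are unit vectors in sensor coordinates and
    `mag` is a representative magnetometer reading in the same frame.
    Headings are measured clockwise from north.
    """
    east = np.cross(g_unit, mag)        # down x magnetic field points east
    east /= np.linalg.norm(east)
    north = np.cross(east, g_unit)      # east x down points north
    heading = np.degrees(np.arctan2(forward @ east, forward @ north)) % 360.0
    return (heading - gps_course_deg + 180.0) % 360.0 - 180.0
```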