UCSD CSE 190 - Object Tracking for Remote Sighted Guide Interface
Object Tracking for Remote Sighted Guide Interface on GroZi Project
Masaaki Kokawa
Department of Computer Science and Engineering
University of California, San Diego
9500 Gilman Drive, La Jolla, CA 92093-0404
[email protected]

Abstract

The Remote Sighted Guide (RSG) interface for GroZi (Grocery Shopping Assistant for the Blind / Visually Impaired) was built during my study under the TIES Summer 2009 program. In this CSE 190a course, I plan to study the effectiveness of applying an object tracking algorithm to the RSG interface. Specifically, I will test a way to observe the direction and distance of motion by detecting the edge lines or front faces of display shelves in grocery stores.

1. Introduction

The UCSD Computer Vision Lab has been working on the GroZi projects [1][2]. GroZi is a solution that allows visually impaired people to shop for groceries independently, without the aid of another person. The GroZi projects are making steady progress under the Teams in Engineering Service (TIES) program at UCSD, and comprise a series of assistive technologies based on image processing, object recognition, and text-to-speech notification. I joined the project this summer and designed and implemented the Remote Sighted Guide interface (see Figure 1). The RSG interface is software for sighted guides who assist visually impaired shoppers from a remote location. The guide marks the location of the item to buy on the camera image and inputs the location of the user's hand; the interface then indicates the distance from the hand to the item to both the guide and the blind user by playing sounds. We conducted an experiment with a board on which cup noodles were lined up, imitating a shelf in a grocery store, and found that a visually impaired user could grab the correct product in 10-15 seconds on average while hearing the indications from the RSG interface. The RSG interface is a milestone toward a fully automated GroZi system with functions such as object recognition and object tracking.

Figure 1: Remote Sighted Guide Interface

2. Questions and Means of Solution

2.1. Questions

A visually impaired user can be guided to grab the correct product (the target) through the RSG interface. However, the current RSG interface has no object recognition or object tracking functions. At present, the camera is mounted on the user's shoulder, so if the user moves far from the target, the guide has to reset the target's location by hand. Object tracking technology should help reduce this burden on the guide. In this CSE 190a course, I plan to study the effectiveness of object tracking for this question.

2.2. Means of solving the question

When implementing object tracking, a significant problem is detecting the direction and distance of motion. In grocery stores, most products are arranged in order on shelves so that customers can easily find what they want to buy, and this holds across countries. Since this organized structure is what makes products easy to locate, it should be meaningful to leverage visual information from this prepared "structure" for object tracking in any grocery store. More specifically, I will test a way to observe the direction and distance of motion by detecting the edge lines or front faces of display shelves. Typically, shelf lines are parallel to the floor and appear above and below the products, so they can serve as references for the coordinate system in the camera view. The proposed object tracking algorithm is as follows (still under consideration):

1. Detect the pair of lines (sometimes a single line) corresponding to the edge lines or front faces of display shelves that are nearest to the center of the captured image, using the Hough transform.
2. Set "the middle point" of this pair of lines.
3. Obtain the captured image of the next frame.
4. Detect a new pair of lines nearest to "the middle point", again using the Hough transform.
5. Assuming these two lines are the same as the lines from step 1, calculate the "new middle point" of the new pair.
6. The difference between "the middle point" and the "new middle point" gives the direction and distance of motion.
7. Run a generic object tracking algorithm with its initial location shifted by the observed direction and distance of motion.

It may be essential to detect the pair of lines frequently enough to observe the direction and distance of motion accurately.

3. Milestones

Week 1: Build a Hough transform component to detect lines in images captured by a web camera (Microsoft VX-6000).
Weeks 2-3: Select appropriate preprocessing filters for stable detection of a pair of lines.
Weeks 4-5: Build a component that detects a pair of lines and their "middle point" to observe the direction and distance of motion.
Weeks 5-7: Apply generic object tracking algorithms (such as MeanShift and CamShift) with the initial location modified by the observed direction and distance of motion.
Weeks 8-9: Test the program and tune the parameters of the Hough transform and the tracking algorithm, as well as the frequency of line detection.
Week 10: Wrap up the project.

If circumstances allow, I would also like to test pattern recognition for detecting the edge lines or front faces of display shelves, as well as other object tracking algorithms.

4. Software

Microsoft Visual C++ 2008 Express Edition and OpenCV 2.0/1.1p.

5. Dataset

If pictures and movies of shelves are available at a grocery store (for example, Sunshine Market at UCSD), these data will be used. If they are not available, an experimental board will be prepared for taking pictures and movies as the dataset.

6. Qualifications

I am a visiting scholar in the Department of Computer Science and Engineering at the University of California, San Diego.
I have taken Image Processing (CSE 166) and Computer Vision I (CSE 252A) in Fall 2009.

References

[1] L. Carlson, D. Fan, S. Singh, and M. Sturtevant. GroZi Shopping Assistant. MAE 156B (Principles of Mechanical Design) Final Report, Spring 2007.
[2] G. S. Foo. Grocery Shopping Assistant for the Blind / Visually Impaired (GroZi). UCSD.
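In practice, the Week 1 component would likely call OpenCV's HoughLines, but the accumulator idea behind step 1 of Section 2.2 can be sketched in plain C++. The sketch below is a simplified illustration, not the project code: it votes only over near-horizontal angles, since shelf edge lines are roughly parallel to the floor, and returns the single strongest line rather than a pair.

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Simplified Hough transform for step 1: vote only for near-horizontal
// lines (theta in 80..100 degrees). A line is parameterized as
// x*cos(theta) + y*sin(theta) = rho; for theta near 90 degrees, rho is
// close to the line's y-coordinate (image row).
// Input: binary edge image, edges[y][x] != 0 on edge pixels.
// Output: (rho, thetaDegrees) of the strongest accumulator cell.
std::pair<int, int> strongestShelfLine(const std::vector<std::vector<int>>& edges) {
    const double PI = 3.14159265358979323846;
    const int h = (int)edges.size();
    const int w = (int)edges[0].size();
    const int maxRho = h + w;  // covers all non-negative rho in this theta range
    std::vector<std::vector<int>> acc(21, std::vector<int>(maxRho + 1, 0));

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            if (!edges[y][x]) continue;
            for (int t = 80; t <= 100; ++t) {  // near-horizontal angles only
                double rad = t * PI / 180.0;
                int rho = (int)std::lround(x * std::cos(rad) + y * std::sin(rad));
                if (rho >= 0 && rho <= maxRho) ++acc[t - 80][rho];
            }
        }

    int bestRho = 0, bestTheta = 90, bestVotes = -1;
    for (int t = 0; t < 21; ++t)
        for (int r = 0; r <= maxRho; ++r)
            if (acc[t][r] > bestVotes) {
                bestVotes = acc[t][r];
                bestRho = r;
                bestTheta = t + 80;
            }
    return {bestRho, bestTheta};
}
```

A shelf edge running along image row y should come back with rho close to y and theta close to 90 degrees; the actual component would additionally threshold the accumulator and keep the two strongest peaks, giving the pair of lines above and below the products.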
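Once a line pair is available, steps 2-6 of Section 2.2 reduce to simple arithmetic. A minimal sketch, assuming the detected lines are near-horizontal (theta around 90 degrees, so each rho is approximately the line's image row); the type and function names here are illustrative, not taken from the project code:

```cpp
// Steps 2-6 of the proposed algorithm, under the assumption that shelf
// lines are near-horizontal so rho approximates the line's y-coordinate.
// The "middle point" of a shelf-line pair is the average of the two rhos;
// frame-to-frame motion is the change between successive middle points.
struct ShelfLinePair {
    double rhoAbove;  // line above the products
    double rhoBelow;  // line below the products
};

double middlePoint(const ShelfLinePair& p) {
    return 0.5 * (p.rhoAbove + p.rhoBelow);
}

// Vertical displacement in pixels between two frames (step 6); step 7
// would shift the tracker's initial search window by this amount.
double verticalMotion(const ShelfLinePair& prev, const ShelfLinePair& next) {
    return middlePoint(next) - middlePoint(prev);
}
```

Note that a pair of horizontal lines only constrains the vertical component of motion; recovering the horizontal component would require additional vertical structure (for example, shelf uprights) or the generic tracker of step 7 itself.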

