Development and Deployment of a Line of Sight Virtual Sensor for Heterogeneous Teams

Robert Grabowski, Pradeep Khosla and Howie Choset
Carnegie Mellon University
Electrical and Computer Engineering, and Mechanical Engineering Department
Pittsburgh, Pennsylvania 15213
{grabowski, pkk, choset}@cs.cmu.edu

Figure 1. The Millibot Team – a heterogeneous collection of small-scale robots designed on the 5 cm scale.

Abstract — For a team of cooperating robots, geometry plays a vital role in operation. Knowledge of line of sight to local obstacles and adjacent teammates is critical in both the movement and planning stages to avoid collisions, maintain formation and localize the team. However, determining whether robots are within line of sight of one another is difficult with existing sensor platforms, especially as the scale of the robot is reduced. We describe a method of exploiting collective team information to generate a virtual sensor that provides line-of-sight determination, greater range and resolution, and the ability to generalize local sensing. We develop this sensor and apply it to the control of a tightly coupled, resource-limited robot team called Millibots.

Keywords — mobile robot teams; sensing; heterogeneous control

I. INTRODUCTION

Robots are versatile machines that can be programmed to react collectively to sensor information in a variety of tasks ranging from surveillance and reconnaissance to rescue support. Despite this versatility, a single robot cannot always realize every application. A team of robots, on the other hand, can coordinate action and sensing to turn a collection of individual entities into a single, cohesive group. To facilitate this coordination, a robot team must be able to manage its formation so that members can exchange information and leverage the proximity of the others.
Formation control is essential in many aspects of team coordination, from communications [1][2] to sensor coverage [7][11] to localization [4][9][12]. A critical component of formation control is line of sight. Line of sight is defined as an open, obstacle-free path between two points in space that is wide enough to allow the passage of information signals such as light, video or ultrasonics. Unfortunately, local sensing is not always sufficient to determine line of sight directly. Local sensors are often limited in range and resolution and are incapable of discriminating between a robot and an obstacle. Even when a robot has access to a local map, it may not have acquired sufficient information to make the determination on its own. This is especially true as the scale of the robot decreases, the number of available sensors is restricted, and the range of local sensing is reduced.

Coordinating multiple robots is a management issue as well. Conventional formation control is based on the idea that each robot is equipped with roughly the same sensing capabilities. Heterogeneous team control must instead account for the differences in the sensing and processing capabilities of each robot. In some cases, this sensing may be rudimentary and unable to provide the local information needed to navigate alone [7][11]. The problem is further compounded when the composition and size of the team are dynamic.

Our work is primarily motivated by the control and coordination of a team of heterogeneous, resource-limited robots called Millibots [7]. These are small-scale robots, on the order of 5 cm on a side, designed to operate in unknown or partially known environments. Their small size gives them access to tight, otherwise inaccessible areas while making them easier to conceal, deploy and manage. However, their small scale and dynamic heterogeneous composition make conventional control strategies difficult to apply.
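Once a shared map is available, the line-of-sight property defined above can be checked directly. The sketch below is illustrative rather than the authors' implementation: it traces a Bresenham ray between two cells of a hypothetical 2D occupancy grid and reports whether any occupied cell blocks the path (the function name and threshold are assumptions).

```python
import numpy as np

def line_of_sight(grid, a, b, occ_thresh=0.5):
    """Return True if no occupied cell lies on the ray from cell a to cell b.

    grid: 2D array of occupancy probabilities (0 = free, 1 = occupied).
    a, b: (row, col) integer cell indices; the endpoints themselves
    (e.g. the robots' own footprints) are not counted as blockers.
    """
    (r0, c0), (r1, c1) = a, b
    dr, dc = abs(r1 - r0), abs(c1 - c0)
    sr = 1 if r1 >= r0 else -1
    sc = 1 if c1 >= c0 else -1
    err = dr - dc
    r, c = r0, c0
    while True:
        if grid[r, c] >= occ_thresh and (r, c) not in (a, b):
            return False          # ray hits an obstacle cell
        if (r, c) == (r1, c1):
            return True           # reached the target with a clear path
        e2 = 2 * err
        if e2 > -dc:              # standard Bresenham error update
            err -= dc
            r += sr
        if e2 < dr:
            err += dr
            c += sc
```

A single-cell-wide ray is the simplest case; a signal with nonzero beam width (e.g. ultrasonics) would require testing a corridor of cells rather than one ray.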
We address coordination of multiple, heterogeneous robots by developing the concept of a 'virtual' sensor. Robot teams have the advantage that they can collectively share information. They are able to fuse range information from a variety of different platforms to build a global occupancy map that represents a single, collective view of the environment. A virtual sensor is simply an abstraction of the team's occupancy map. We call this a virtual sensor because it has all the properties of a real sensor with respect to that robot's navigation and planning, but is derived from information already processed rather than from the physical interaction of a sensor and its surroundings. However, when employed by the individual, the information derived from a virtual sensor can be treated in the same fashion as a real sensor.

Figure 2. Generating the virtual sensor. a) A robot's local sensors may not have the resolution to determine line of sight to other members (dark gray profile). b) The first step is to map the team's occupancy map into an individual polar map with the target robot at the center; the axes are range and bearing. c) We process the polar map to generate a polar contour: for a given bearing, we mark the closest transition from open space to form obstacle contours and frontier boundaries. Additionally, we project the profiles of adjacent teammates onto the map. d) We test each point in space against the contour map; all points within this region are within line of sight of the target robot. e) We use the values of the contour map to generate a virtual range sensor with greater range and resolution than the robot's individual sensors. This generalized profile is the same for any robot regardless of the underlying sensor platform.

Proceedings of the 2004 IEEE International Conference on Robotics & Automation, New Orleans, LA, April 2004. 0-7803-8232-3/04/$17.00 ©2004 IEEE. 3024
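As a rough sketch of the polar-contour idea, assuming the fused occupancy map is a 2D array of occupancy probabilities, one could cast a ray at each bearing from the target robot's cell and record the range of the first open-to-occupied transition. All names and parameters below are illustrative, not the paper's implementation.

```python
import numpy as np

def virtual_range_scan(grid, center, n_bearings=360, max_range=50.0,
                       occ_thresh=0.5, step=0.5):
    """Build a polar contour around `center` from an occupancy grid.

    For each bearing, march outward in increments of `step` cells and
    record the range of the first occupied cell; bearings that reach
    `max_range` or leave the map are frontiers, left at max_range.
    """
    rows, cols = grid.shape
    r0, c0 = center
    scan = np.full(n_bearings, max_range)   # default: frontier at max range
    for i in range(n_bearings):
        theta = 2.0 * np.pi * i / n_bearings
        rng = step
        while rng < max_range:
            r = int(round(r0 + rng * np.sin(theta)))
            c = int(round(c0 + rng * np.cos(theta)))
            if not (0 <= r < rows and 0 <= c < cols):
                break                        # left the map: treat as frontier
            if grid[r, c] >= occ_thresh:
                scan[i] = rng                # closest obstacle on this bearing
                break
            rng += step
    return scan
```

A point at polar coordinates (range, bearing) relative to the robot is then within line of sight exactly when its range is less than the scan value at that bearing, mirroring the point test of step d); reading the scan values directly yields the generalized virtual range sensor of step e).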
In Section III, we develop the virtual sensor and show how it can provide essential line-of-sight information about obstacles, open space and other robots regardless of the platform being employed. In Section IV, we show how this generalization aids local, sensor-based planning by providing information with greater range and resolution than existing local sensors. We then show how it can be extended to the planning stage with respect to maintaining line of sight to multiple members during and after movement. Finally, in Section V, we show how the virtual sensor allows the generalization of existing sensors in such a

