Appears in the Proceedings of the 1992 IEEE Conference on Robotics and Automation (ICRA-92), pp. 2719–2724.

SSS: A Hybrid Architecture Applied to Robot Navigation

Jonathan H. Connell
IBM T.J. Watson Research Center, Box 704, Yorktown Heights NY 10598

Abstract

This paper describes a new three layer architecture, SSS, for robot control. It combines a servo-control layer, a "subsumption" layer, and a symbolic layer in a way that allows the advantages of each technique to be fully exploited. The key to this synergy is the interface between the individual subsystems. We describe how to build situation recognizers that bridge the gap between the servo and subsumption layers, and event detectors that link the subsumption and symbolic layers. The development of such a combined system is illustrated by a fully implemented indoor navigation example. The resulting real robot, called "TJ", is able to automatically map office building environments and smoothly navigate through them at the rapid speed of 2.6 feet per second.

1: Introduction

In the "SSS" architecture (an acronym for "servo, subsumption, symbolic" systems) we have tried to combine the best features of conventional servo-systems and signal processing with multi-agent reactive controllers and state-based symbolic AI systems.

For instance, servo-controllers have trouble with many real-world phenomena which are not understood well enough to be modelled accurately or which are non-linear. Behavior-based or subsumption systems (e.g. [2, 5]), on the other hand, do not impose as many modelling constraints on the world and are good at making rapid, radical decisions. Yet such systems often yield jerky motions due to their slow sample rate and their discrete view of the world. This shortcoming can in turn be easily rectified by adding appropriate servo-systems, which are particularly good at making smooth motions.

Behavior-based systems also have problems with world modelling and persistent state. Since behavior-based systems are often implemented in a distributed fashion, there is no good place to put a world model. Indeed, many of the adherents of this school claim that this is a beneficial feature of such systems [3]. However, for some tasks, such as navigation, it is certainly convenient to have higher-level centralized representations. This is the forte of standard hierarchical symbolic programming languages. The usual stumbling block of such systems, real-time control, can be finessed by delegating tactical authority to the subsumption and servo control layers.

Figure 1 - The SSS architecture combines 3 control techniques which can be characterized by their treatment of time and space. Special interfaces allow the layers of this system to cooperate effectively.

The three layers in our system come from progressively quantizing first space then time. As shown in figure 1, the servo-style system basically operates in a domain of continuous time and continuous space. That is, these systems constantly monitor the state of the world and typically represent this state as an ensemble of scalar values. Behavior-based systems also constantly check their sensors, but their representations tend to be special-purpose recognizers for certain types of situations. In this way behavior-based systems discretize the possible states of the world into a small number of special task-dependent categories.
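To make this kind of spatial quantization concrete, the short sketch below shows one way a behavior-based layer might collapse continuous range readings into a few task-dependent situation categories. It is only an illustration under assumed conditions: the sensor inputs, thresholds, and category names are invented here and are not taken from the paper.

# Hypothetical situation recognizer: maps continuous range readings to a
# small set of coarse, task-dependent categories (illustration only).

def classify_situation(left_range_m, front_range_m, right_range_m):
    """Map an ensemble of scalar range readings to one coarse situation."""
    WALL_NEAR = 0.5   # assumed threshold, in meters
    WALL_FAR = 1.5    # assumed threshold, in meters
    if front_range_m < WALL_NEAR:
        return "blocked-ahead"      # calls for stopping or turning
    if left_range_m < WALL_NEAR and right_range_m < WALL_NEAR:
        return "in-corridor"        # calls for centering between walls
    if left_range_m < WALL_FAR:
        return "wall-on-left"       # calls for left wall-following
    if right_range_m < WALL_FAR:
        return "wall-on-right"      # calls for right wall-following
    return "open-space"             # calls for cruising toward the goal

# Each behavior only needs the coarse category that holds right now,
# not the exact geometry of the surroundings.
print(classify_situation(0.4, 3.0, 0.45))   # -> "in-corridor"

A recognizer of this sort plays the role of the "matched filters" discussed below: sensory states that call for the same motor response are treated as equivalent.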
Symbolic systems take this one step further and also discretize time on the basis of significant events. They commonly use terms such as "after X do Y" and "perform A until B happens". Since we create temporal events on the basis of changes in spatial situations, it does not make sense for us to discretize time before space. For the same reason, we do not include a fourth layer in which space is continuous but time is discrete.

In order to use these three fairly different technologies we must design effective interfaces between them. The first interface is the command transformation between the behavior-based layer and the underlying servos. Subsumption-style controllers typically act by adjusting the setpoints of the servo-loops, such as the wheel speed controller, to one of a few values. All relevant PID calculations and trapezoidal profile generation are then performed transparently by the underlying servo system.

The sensory interface from a signal-processing front-end to the subsumption controller is a little more involved. A productive way to view this interpretation process is in the context of "matched filters" [16, 6]. The idea here is that, for a particular task, certain classes of sensory states are equivalent since they call for the same motor response by the robot. There are typically some key features that, for the limited range of experiences the robot is likely to encounter, adequately discriminate the relevant situations from all others. Such "matched filter" recognizers are the mechanism by which spatial parsing occurs.

The command interface between the symbolic and subsumption layers consists of the ability to turn each behavior on or off selectively [7], and to parameterize certain modules. These event-like commands are "latched" and continue to remain in effect without requiring constant renewal by the symbolic system.

The sensor interface between the behavior-based layer and the symbolic layer is accomplished by a mechanism which looks for the first instant in which various situation recognizers are all valid. For instance, when the robot has not yet reached its goal but notices that it has not been able to make progress recently, this generates a "path-blocked" event for the symbolic layer. To help decouple the symbolic system from real-time demands we have added a structure called the "contingency table". This table allows the symbolic system to pre-compile what actions to take when certain events occur, much as baseball outfielders yell to each other "the play's to second" before a pitch. The entries in this table reflect what the symbolic system expects to occur and each embodies a one-step plan for coping with the actual outcome.

Figure 2 - TJ has a 3-wheeled omni-directional base and uses both sonar ranging and infrared proximity detection for navigation. All servo-loops, subsumption modules, and the contingency table are on-board. The robot receives path segment commands over a spread-spectrum radio link.

2: The navigation task

The task for our robot (shown in figure 2) is to
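As a supplement to the event-detector and contingency-table interface described above, the following sketch shows how a symbolic layer might pre-compile one-step reactions to events raised by the lower layers. The event names, table entries, and commands are hypothetical illustrations, not Connell's implementation.

# Hypothetical contingency table plus edge-triggered event detection
# (illustration only; names and commands are assumptions, not from the paper).

class ContingencyTable:
    def __init__(self):
        # event name -> latched command for the subsumption layer
        self.entries = {}

    def expect(self, event, command):
        """Pre-compile the reaction before the event actually occurs."""
        self.entries[event] = command

    def handle(self, event):
        """Answer immediately when an event fires, without replanning."""
        return self.entries.get(event, "halt-and-replan")

def detect_events(prev, curr):
    """Edge-triggered detection: report the first instant a situation holds."""
    events = []
    if curr["no-progress"] and not curr["at-goal"] and not prev["no-progress"]:
        events.append("path-blocked")
    if curr["at-goal"] and not prev["at-goal"]:
        events.append("segment-done")
    return events

table = ContingencyTable()
table.expect("segment-done", "turn-left-then-follow-wall")
table.expect("path-blocked", "back-up-and-retry")

prev = {"no-progress": False, "at-goal": False}
curr = {"no-progress": True, "at-goal": False}
for event in detect_events(prev, curr):
    print(event, "->", table.handle(event))   # path-blocked -> back-up-and-retry

The point of the table is the decoupling: when the event detector fires, the reaction has already been latched and can be issued without waiting for the symbolic planner to run.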

