Berkeley COMPSCI 294 - Active Perception


Slide outline:
• Active Perception
• Active Perception vs. Active Sensing
• Active Sensing
• Slide 4
• Active Perception turned into an engineering agenda
• Engineering agenda, cont.
• Active Vision theory
• Active Vision, cont.
• Control Strategies
• Bottom up and Top down process
• GOALS/TASKS
• Goal/Task
• Note
• Environments/context
• Environment/context
• Slide 16
• Active Vision System for 3D object recognition
• Table
• Table cont.
• Comments:
• Active Visual Observer
• The UPENN System
• PennEyes: A Binocular Active Vision System
• PennEyes
• Design considerations
• Puma Polka
• Tracking Performance
• BiSight Head
• BiSight head
• C40 Architecture
• Slide 31
• Critical Issues
• Synchronization
• Other considerations
• Control

Active Perception
• "We not only see but we look, we not only touch but we feel." (J. J. Gibson)

Active Perception vs. Active Sensing
• WHAT IS ACTIVE SENSING?
• In the robotics and computer vision literature, the term "active sensor" generally refers to a sensor that transmits energy (generally electromagnetic radiation, e.g., radar, sonar, ultrasound, microwaves, and collimated light) into the environment and receives and measures the reflected signals.
• We believe that the use of active sensors is not a necessary condition on active sensing, and that sensing can be performed with passive sensors (which only receive, and do not emit, information) employed actively.

Active Sensing
• Hence the problem of Active Sensing can be stated as a problem of controlling strategies applied to the data acquisition process, where those strategies depend on the current state of the data interpretation and on the goal or task of the process.
• The question may be asked, "Is Active Sensing only an application of Control Theory?" Our answer is: "No, at least not in its simple version." Here is why:

Active Perception
• 1) The feedback is performed not only on sensory data but on complex processed sensory data, i.e., various extracted features, including relational features.
• 2) The feedback depends on a priori knowledge and models that are a mixture of numeric/parametric and symbolic information.

Active Perception turned into an engineering agenda
• The implications of the active sensing/perception approach are the following:
• 1) The necessity of models of sensors. This means, first, a model of the physics of the sensors as well as of their noise, and second, a model of the signal processing and data reduction mechanisms that are applied to the measured data. These processes produce parameters with a definite range of expected values plus some measure of uncertainty. These models shall be called Local Models.

Engineering agenda, cont.
• 2) The system (which mirrors the theory) is modular, as dictated by good computer science practice, and interactive, that is, it acquires data as needed. In order to make predictions about the whole outcome, we need, in addition to the models of each module described in 1) above, models for the whole process, including feedback. We shall refer to these as Global Models.
• 3) Explicit specification of the initial and final state/goal.
• If the Active Vision theory is a theory, what is its predictive power? There are two components to the theory, each with certain predictions:

Active Vision theory
• 1) Local models. At each processing level, local models are characterized by certain internal parameters. One example of a local model is a region growing algorithm whose internal parameters are the local similarity and the size of the local neighborhood; another is an edge detection algorithm whose parameter is the width of the band-pass filter in which the edge effect is detected. These parameters predict a) the definite range of plausible values and b) the noise and uncertainty, which together determine the expected resolution, sensitivity, and robustness of the output of each module.
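To make the idea of a local model concrete, here is a minimal Python sketch written for this summary rather than taken from the slides: an edge-detection module packaged with a) the plausible range of its internal parameter (the filter width) and b) a prediction of its output uncertainty. The class name, the SciPy-based filter, and the toy uncertainty formula are all illustrative assumptions.

```python
# Hypothetical sketch of a "local model": a processing module together with
# the plausible range of its internal parameter and a predicted uncertainty.
import numpy as np
from scipy import ndimage


class EdgeDetector:
    """Edge detection whose key internal parameter is the filter width."""

    SIGMA_RANGE = (0.5, 5.0)  # a) plausible range of the band-pass width, in pixels

    def run(self, image: np.ndarray, sigma: float) -> np.ndarray:
        lo, hi = self.SIGMA_RANGE
        if not lo <= sigma <= hi:
            raise ValueError(f"sigma={sigma} outside plausible range {lo}..{hi}")
        # Gradient magnitude of a Gaussian-smoothed image as a simple edge map.
        return ndimage.gaussian_gradient_magnitude(image.astype(float), sigma)

    def expected_localization_error(self, sigma: float) -> float:
        # b) predicted uncertainty: a toy model in which edge localization
        # error (in pixels) grows with the width of the smoothing filter.
        return 0.5 * sigma
```

A region growing module would be wrapped the same way, exposing its similarity threshold and neighborhood size together with the expected robustness of the regions it produces.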
Active Vision, cont.
• 2) Global models characterize the overall performance and make predictions about how the individual modules will interact, which in turn determines how intermediate results are combined. The global models also embody the global external parameters and the initial and final global state of the system. The basic assumption of the Active Vision approach is the inclusion of feedback in the system and the gathering of data as needed. The global model represents all the explicit feedback connections, the parameters, and the optimization criteria that guide the process.

Control Strategies
• Three distinct control stages proceed in sequence: initialization, processing in midterm, and completion of the task.
• Strategies are divided with respect to the tradeoff between how much data measurement the system acquires (data driven, bottom-up) and how much a priori or acquired knowledge the system uses at a given stage (knowledge driven, top-down). Of course, there is also the strategy that combines the two.

Bottom up and Top down process
• To eliminate possible ambiguities with the terms bottom-up and top-down, we define them here. Bottom-up (data driven), in this discussion, is defined as a control strategy where no concrete semantic, context-dependent model is available, as opposed to the top-down strategy, where such knowledge is available.

GOALS/TASKS
• Different tasks will determine the design of the system, i.e., its architecture.
• Consider the following tasks:
• Manipulation
• Mobility
• Communication and interaction: machine to machine, people to people via digital media, or people to machine.

Goal/Task
• Geographically distributed communication and interaction using multimedia (primarily vision) over the Internet.
• We are concerned primarily with unspoken communication: gestures and body motion.
• Examples are coordinated movement such as dance, physical exercises, training of manual skills, and remote guidance of physical activities.

Note
• Recognition and learning will play a role in all of these tasks.

Environments/context
• The environment serves as a constraint on the design.
• We shall consider only the constraints relevant to the visual task that serves to accomplish the physical activity.
• For example, in the manipulation task the size of the object determines not only the data acquisition strategy but also the design of the vision system (choice of field of view, focal length, illumination, and spatial resolution). Think of moving furniture vs. picking up a coin.

Environment/context
• Another example: Mobility.
• There is a difference if the mobility is on the ground, in …
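As a closing illustration of how the pieces above fit together (the three control stages, feedback performed on processed features rather than raw data, and gathering data only as needed), here is a small, hypothetical Python sketch. Every object and method name in it (sensor, global_model, choose_next_view, and so on) is an illustrative placeholder, not part of any system described in the slides.

```python
# Hypothetical sketch of the overall control strategy: initialization,
# mid-term processing with feedback on processed features, and completion.
# All objects and method names are illustrative placeholders.
def active_perception_loop(sensor, global_model, goal, max_steps=50):
    # Initialization: start from the a priori (top-down) knowledge held in
    # the global model, e.g. an initial viewpoint and parameter setting.
    state = global_model.initial_state(goal)

    for _ in range(max_steps):
        # Bottom-up: acquire a measurement and reduce it to features via the
        # local models (edge maps, regions, relational features, ...).
        measurement = sensor.acquire(state.sensing_parameters)
        features = global_model.extract_features(measurement, state)

        # Feedback on processed data: combine the new features with prior
        # numeric/parametric and symbolic knowledge (top-down).
        state = global_model.update(state, features)

        # Completion: stop once the goal/task criterion is satisfied.
        if global_model.goal_reached(state, goal):
            return state

        # Otherwise decide what data to gather next (the "active" part),
        # e.g. sensing parameters that best reduce the remaining uncertainty.
        state.sensing_parameters = global_model.choose_next_view(state, goal)

    return state  # best interpretation reached within the step budget
```

In such a sketch the environment/context constraints from the last slides would enter through the sensing parameters (field of view, focal length, spatial resolution) that choose_next_view is allowed to select.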

