Computational Vision
U. Minn. Psy 5036
Daniel Kersten
Lecture 6

Goals

Last time
Developed signal detection theory for characterizing an ideal observer that detects a "known" pattern in additive Gaussian noise.
The statistical treatment is a special case of Bayesian inference.
Showed how human and ideal performance can be quantitatively compared by their respective sensitivities, d'.

This time
How can we manage complex pattern inference tasks?
Extend the tools of signal detection theory to object recognition and estimation.
Two main observations for simplification:
Graphical models of influence.
Task dependence: Bayesian inference theory -> Bayesian decision theory, to take into account what information is important and what is not, i.e. what is signal and what is noise.

Some motivation: Examples of object tasks

Estimation
Imagine the top of a coffee mug. It typically has a circular cross-section. However, due to projection, the image on your retina is more like an ellipse from most viewpoints. Now imagine it is a "designer coffee mug" with an elliptical cross-section. How could you guess the true, i.e. physical, 3D shape from measurements made in the projected image? The "aspect" slider below changes the ratio of the major to minor axes of the coffee mug. The "y" variable changes the slant of your viewpoint. These two causes determine an image measurement x, the height of the projected ellipse in the image (see the "Slant" example below).

(* Interactive demo: the mug rim as a cylinder whose cross-section is
   stretched by aspect, viewed from a slant controlled by y *)
Manipulate[
 Graphics3D[
  {EdgeForm[],
   Scale[Cylinder[{{0.0, -.05, -.0}, {.0, .05, .0}}, 1/2], {1, 1, aspect}]},
  Boxed -> False, ImageSize -> Tiny,
  ViewCenter -> {0, 0, 0}, ViewPoint -> {0, 10, y}],
 {{aspect, 1.0}, .1, 2},
 {y, -20, 20}]

Recognition
Suppose you are doing some grocery shopping in the fruit and vegetable section. You are looking at a fruit that is either a tomato or an apple. The type of fruit influences the regularities in the measurements on which you have to base your decision. For example, the contours of the fruit might be more like what you have previously experienced from apples, or from tomatoes. But there might be some ambiguity: silhouettes of apples aren't that different from those of tomatoes. Another kind of measurement could come from spectral measurements (e.g. from your cone photoreceptors), i.e. from longer-wave vs. shorter-wave parts of the spectrum. But these measurements rely on intermediate variables of material, i.e. the red or green stuff that the skin of the fruit is made of. The figure below shows the generative model. Given shape and wavelength measurements, how can one make the best guess of the fruit type and/or the material type? We won't solve this problem in general, but we will look at a very simple version, with a view to understanding how different kinds of tasks affect the guesses even when the generative model remains unchanged.

Graphical Models of dependence
The generative model in the previous lecture was simple. The signals were two fixed images (e.g. a sinusoidal grating and a uniform image), and the image variability was solely due to additive noise. What about natural images? The "universe" of possible factors generating an image could be expressed by constructing the joint probability on all possible combinations of descriptions.

For example, suppose we have decided that the key variables needed to model all natural images can be broken down into descriptions of the scene, object class, environment lighting, object reflectivity, and object shape, and that these result in several kinds of data measurements, such as global features, local features, and haptic features. Then our knowledge of the universe of natural images could be modeled as:

p(scene, object class, environment lighting, object reflectivity, object shape, global features, local features, haptic)

where each of the variable classes is itself a high-dimensional description. But this is clearly hopelessly large, because of the combinatorial problem. Natural images are complex, and in general it is difficult and often impractical to build a detailed quantitative generative model. But natural images do have regularities, and we can get insight into the problem by considering how various factors or causes might produce natural images. We can also simplify based on assumptions about what kind of information, i.e. which factors, is important to estimate.

One way to begin simplifying the problem is to note that not all variables have a direct influence on each other. Imagine you are designing a 3D software environment for quickly generating visual images, perhaps with some touch or haptic output too. We draw a graph in which lines connect only those variables that influence each other. We are going to use directed graphs to represent conditional probabilities.
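To see how such a directed graph pins down a joint distribution, consider the fruit example above, where the graph is fruit -> material -> color measurement, together with fruit -> shape measurement, so the joint factors as p(fruit, material, shape, color) = p(fruit) p(material | fruit) p(shape | fruit) p(color | material). The following is a minimal sketch of this factorization; all of the numerical probabilities are hypothetical values chosen for illustration, not numbers from the lecture.

(* Conditional probability tables for the graph
   fruit -> material -> color, and fruit -> shape.
   All numbers are made up for illustration. *)
pFruit = <|"apple" -> 0.7, "tomato" -> 0.3|>;
pMaterialGivenFruit = <|
   "apple" -> <|"red" -> 0.4, "green" -> 0.6|>,
   "tomato" -> <|"red" -> 0.9, "green" -> 0.1|>|>;
pShapeGivenFruit = <|
   "apple" -> <|"appleLike" -> 0.8, "tomatoLike" -> 0.2|>,
   "tomato" -> <|"appleLike" -> 0.3, "tomatoLike" -> 0.7|>|>;
pColorGivenMaterial = <|
   "red" -> <|"longWave" -> 0.9, "shortWave" -> 0.1|>,
   "green" -> <|"longWave" -> 0.2, "shortWave" -> 0.8|>|>;

(* The joint factors according to the graph:
   p(f, m, s, c) = p(f) p(m | f) p(s | f) p(c | m) *)
pJoint[f_, m_, s_, c_] :=
 pFruit[f] pMaterialGivenFruit[f][m] pShapeGivenFruit[f][s] pColorGivenMaterial[m][c]

(* Posterior over fruit type given the two measurements,
   marginalizing out the unobserved material variable *)
posteriorFruit[s_, c_] := Module[{unnorm},
  unnorm = Table[
    Sum[pJoint[f, m, s, c], {m, {"red", "green"}}], {f, Keys[pFruit]}];
  AssociationThread[Keys[pFruit], unnorm/Total[unnorm]]]

posteriorFruit["tomatoLike", "longWave"]
(* -> <|"apple" -> 0.278..., "tomato" -> 0.721...|> *)

The same four tables support different tasks: a posterior over fruit, over material, or over both, depending on which variables the task treats as signal and which it marginalizes as noise.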
Conditional dependence and independence

Two variables can become independent conditional on knowledge of a third
Two random variables may become independent once the value of some third variable is known. This is called conditional independence. Recall from the probability overview that two random variables are independent if and only if their joint probability equals the product of their individual probabilities. Thus, if p(A,B) = p(A)p(B), then A and B are independent. If p(A,B|C) = p(A|C)p(B|C), then A and B are conditionally independent given C. When corn prices drop in the summer, hay fever incidence goes up. Strange correlations like this suggest a common cause, such as the kind of weather that is conducive to both corn and ragweed growth. And if the joint on corn prices and hay fever factors once the weather is known, i.e. p(corn, hay fever | weather) = p(corn | weather) p(hay fever | weather), then the two are conditionally independent given the weather.
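A small numeric sketch makes the definition concrete. Suppose, with made-up numbers, that the weather is the common cause: a corn price drop and hay fever each depend only on the weather. Conditioned on the weather the joint factors exactly, while the marginal joint does not:

(* Hypothetical common-cause model:
   weather -> corn price drop, weather -> hay fever.
   Exact rationals keep the equality test exact. *)
pWeather = <|"wet" -> 1/2, "dry" -> 1/2|>;
pCornDropGivenW = <|"wet" -> 4/5, "dry" -> 1/5|>;   (* P(corn price drops | weather) *)
pHayFeverGivenW = <|"wet" -> 7/10, "dry" -> 1/10|>; (* P(hay fever | weather) *)

(* Joint distribution built from the graph *)
jointWCH[w_, corn_, hay_] := pWeather[w]*
  If[corn, pCornDropGivenW[w], 1 - pCornDropGivenW[w]]*
  If[hay, pHayFeverGivenW[w], 1 - pHayFeverGivenW[w]]

(* Conditional independence holds by construction:
   p(corn, hay | wet) == p(corn | wet) p(hay | wet) *)
jointWCH["wet", True, True]/pWeather["wet"] ==
 pCornDropGivenW["wet"] pHayFeverGivenW["wet"]  (* True *)

(* But marginally the two are dependent: p(corn, hay) != p(corn) p(hay) *)
pCH = Sum[jointWCH[w, True, True], {w, {"wet", "dry"}}];
pC = Sum[jointWCH[w, True, h], {w, {"wet", "dry"}}, {h, {True, False}}];
pH = Sum[jointWCH[w, c, True], {w, {"wet", "dry"}}, {c, {True, False}}];
{pCH, pC pH}  (* {29/100, 1/5} *)

The marginal dependence, 29/100 versus 1/5, is the "strange correlation" described above: it arises with no direct link between corn prices and hay fever, only through the shared cause.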