UB CSE 574 - Markov Random Fields


Markov Random Fields
Machine Learning CSE 574, Autumn 2007
Sargur Srihari ([email protected])

Contents
1. Introduction
   Markov Random Field Terminology
2. Conditional Independence
   Conditional Independence Test
   Conditional Independence
   Markov Blanket for Undirected Graph
3. Factorization Properties
   Clique in a Graph
   Factors as Cliques
   Graphical Model as Filter
4. Potential Functions
5. Illustration: Image De-noising
   Markov Random Field Model
   Energy Functions
   Potential Function
   De-noising Problem Statement
   De-noising Algorithm
   Image Restoration Results
   Some Observations on the De-noising Algorithm
6. Relation to Directed Graphs
   Converting to Undirected Graph
   Generalized Construction
   D-map, I-map and Perfect Map
References

Topics
1. Introduction
   1. Undirected graphical models
   2. Terminology
2. Conditional independence
3. Factorization properties
   1. Maximal cliques
   2. Hammersley-Clifford theorem
4. Potential function and energy function
5. Image de-noising example
6. Relation to directed graphs
   1. Converting directed graphs to undirected: moralization
   2. D-map, I-map and perfect map

1. Introduction
• Directed graphical models specify
  – a factorization of the joint distribution over a set of variables
  – a set of conditional independence properties that must be satisfied by any distribution that factorizes according to the graph
• An MRF is an undirected graphical model that likewise specifies
  – a factorization
  – conditional independence relations

Markov Random Field Terminology
• Also known as a Markov network or undirected graphical model
• A set of nodes corresponding to variables or groups of variables
• A set of links connecting pairs of nodes
• Links are undirected (they do not carry arrows)
• Conditional independence is an important concept

2. Conditional Independence
• In directed graphs, conditional independence is tested by d-separation
  – i.e., whether all paths between two sets of nodes are blocked
  – the definition of "blocked" is subtle, owing to the presence of head-to-head nodes
• In MRFs the asymmetry between parent and child is removed
  – the subtleties caused by head-to-head nodes no longer arise

Conditional Independence Test
• Identify three sets of nodes A, B and C
• To test the conditional independence property A ⊥ B | C, consider all possible paths from nodes in set A to nodes in set B
  – if every such path passes through one or more nodes in C, then all paths are blocked and the independence property holds
• If some path is unblocked, the property may not hold
  – more precisely, there is at least some distribution consistent with the graph for which conditional independence does not hold

Conditional Independence
• Every path from any node in A to any node in B passes through C
• There is no "explaining away", so testing for independence is simpler than in directed graphs
• Alternative view (implemented in the sketch below):
  – remove all nodes in set C from the graph, together with all their links
  – if no path from A to B remains, the conditional independence property holds
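The "alternative view" above translates directly into code: delete the nodes in C, then test reachability. Below is a minimal sketch, assuming the graph is stored as a dict mapping each node to a set of neighbors; the representation and the function name is_separated are illustrative choices, not from the slides.

from collections import deque

def is_separated(graph, A, B, C):
    """Test the MRF conditional independence property A ⊥ B | C by
    graph separation: conceptually remove the nodes in C (and their
    links), then check that no path remains from A to B.

    graph: dict mapping each node to the set of its neighbors.
    """
    A, B, C = set(A), set(B), set(C)
    # Breadth-first search from A that never enters a node in C.
    visited = set(A)
    frontier = deque(A)
    while frontier:
        node = frontier.popleft()
        if node in B:
            return False          # unblocked path found: not separated
        for nbr in graph[node] - C:
            if nbr not in visited:
                visited.add(nbr)
                frontier.append(nbr)
    return True                   # every path from A to B passes through C

# A chain a - c - b: a and b are independent given c, but not marginally.
chain = {"a": {"c"}, "c": {"a", "b"}, "b": {"c"}}
assert is_separated(chain, {"a"}, {"b"}, {"c"})       # blocked by c
assert not is_separated(chain, {"a"}, {"b"}, set())   # path a-c-b is open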
Markov Blanket for an Undirected Graph
• Takes a particularly simple form for an MRF
• A node is conditionally independent of all other nodes given its neighboring nodes

3. Factorization Properties
• We seek a factorization rule corresponding to the conditional independence test described above; this requires a notion of locality
• Consider two nodes x_i and x_j not connected by a link
  – they are conditionally independent given all other nodes in the graph, because there is no direct path between them, and all other paths pass through observed nodes and hence are blocked
  – expressed as

        p(x_i, x_j \mid \mathbf{x}_{\setminus\{i,j\}}) = p(x_i \mid \mathbf{x}_{\setminus\{i,j\}}) \, p(x_j \mid \mathbf{x}_{\setminus\{i,j\}})

    where \mathbf{x}_{\setminus\{i,j\}} denotes the set \mathbf{x} of all variables with x_i and x_j removed
• For this conditional independence to hold, the factorization must be such that x_i and x_j never appear in the same factor
  – this leads to the graph concept of a clique

Clique in a Graph
• A subset of nodes such that there exists a link between every pair of nodes in the subset
  – the nodes of a clique are fully connected
• Maximal clique: a clique that cannot be extended by any other node of the graph without ceasing to be a clique
• [Figure: an example graph containing five cliques of two nodes and two maximal cliques]

Factors as Cliques
• Take the factors of the joint distribution to be functions of the maximal cliques
• The set of variables in clique C is denoted \mathbf{x}_C
• The joint distribution is written as a product of potential functions \psi_C(\mathbf{x}_C):

        p(\mathbf{x}) = \frac{1}{Z} \prod_C \psi_C(\mathbf{x}_C)

  where Z, called the partition function, is the normalization constant

        Z = \sum_{\mathbf{x}} \prod_C \psi_C(\mathbf{x}_C)

Graphical Model as Filter
• U_I is the set of distributions consistent with the conditional independence statements read from the graph using graph separation
• U_F is the set of distributions that can be expressed as a factorization of the form p(\mathbf{x}) = \frac{1}{Z} \prod_C \psi_C(\mathbf{x}_C)
• The Hammersley-Clifford theorem states that U_I and U_F are identical

4. Potential Functions
• Potential functions \psi_C(\mathbf{x}_C) must be strictly positive
• It is convenient to express them as exponentials

        \psi_C(\mathbf{x}_C) = \exp\{-E(\mathbf{x}_C)\}

  where E(\mathbf{x}_C) is called an energy function
• The exponential representation is called the Boltzmann distribution
• The total energy is obtained by adding the energies of the maximal cliques

5. Illustration: Image De-noising
• Task: noise removal from a binary image
• Observed noisy image: binary pixel values y_i ∈ {−1, +1}, i = 1, ..., D
• Unknown noise-free image: binary pixel values x_i ∈ {−1, +1}, i = 1, ..., D
• The noisy image is assumed to flip the sign of pixels at random with some small probability

Markov Random Field Model
• What we know:
  – since the noise level is small, there is a strong correlation between input x_i and output y_i
  – neighboring pixels x_i and x_j are strongly correlated (a property of images)
• This prior knowledge is captured using an MRF
  – [Figure: the model's undirected graph]

Energy Functions
• The graph has two types of cliques, each containing two variables:
  1. {x_i, y_i}, expressing the correlation between a pixel and its observation
     – choose the simple energy function −η x_i y_i, giving lower energy (higher probability) when x_i and y_i have the same sign
  2. {x_i, x_j} for neighboring pixels
     – choose −β x_i x_j, for the same reason

[The preview is truncated here; the remaining slides listed in the Contents (de-noising algorithm, image restoration results, relation to directed graphs, references) are not included.]
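The preview cuts off before the total energy is written out, but the Potential Functions slide states that the total energy is the sum of the clique energies, so the two terms above combine as follows. This is a reconstruction consistent with the definitions shown, not a formula quoted from the slides; {i,j} ranges over pairs of neighboring pixels.

% Total energy: sum of the two kinds of clique energies defined above
% (reconstructed; the truncated preview stops before this point).
E(\mathbf{x}, \mathbf{y}) = -\beta \sum_{\{i,j\}} x_i x_j \;-\; \eta \sum_i x_i y_i

% The corresponding Boltzmann distribution, with partition function Z:
p(\mathbf{x}, \mathbf{y}) = \frac{1}{Z} \exp\{-E(\mathbf{x}, \mathbf{y})\}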


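The Contents also list a "De-noising algorithm" slide that falls outside the preview. Below is a minimal runnable sketch of one standard way to minimize the energy above: iterated conditional modes (ICM), which repeatedly flips any pixel whose flip lowers E(x, y). The choice of ICM itself, the function names, and the values of η and β are assumptions made for illustration, not taken from the slides.

import numpy as np

def local_energy_diff(x, y, i, j, eta, beta):
    """Change in total energy E(x, y) if pixel x[i, j] is flipped.

    Only the cliques containing x[i, j] contribute: the data clique
    {x_ij, y_ij} and the cliques with its (up to four) grid neighbors.
    Flipping x_ij negates every energy term containing it, so
    delta E = 2 * x_ij * (beta * sum_of_neighbors + eta * y_ij).
    """
    H, W = x.shape
    nbr_sum = 0.0
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < H and 0 <= nj < W:
            nbr_sum += x[ni, nj]
    return 2.0 * x[i, j] * (beta * nbr_sum + eta * y[i, j])

def icm_denoise(y, eta=2.1, beta=1.0, sweeps=10):
    """Greedy energy minimization (ICM): start from the noisy image
    and flip any pixel whose flip lowers the energy, until no flip
    helps or the sweep budget runs out."""
    x = y.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(x.shape[0]):
            for j in range(x.shape[1]):
                if local_energy_diff(x, y, i, j, eta, beta) < 0:
                    x[i, j] = -x[i, j]
                    changed = True
        if not changed:          # reached a local minimum of E(x, y)
            break
    return x

# Toy usage: a 16x16 image of -1/+1 pixels with ~10% of signs flipped.
rng = np.random.default_rng(0)
clean = np.ones((16, 16), dtype=int)
clean[4:12, 4:12] = -1
noisy = clean * rng.choice([1, -1], size=clean.shape, p=[0.9, 0.1])
restored = icm_denoise(noisy)
print("pixels wrong before:", int((noisy != clean).sum()),
      "after:", int((restored != clean).sum()))

ICM finds only a local minimum of the energy; it is simple and fast, which is presumably why it suits a lecture illustration, but graph-cut or sampling methods can reach lower-energy states on the same model.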