Berkeley COMPSCI 294 - Image Processing Techniques and Smart Image Manipulation

Image Processing Techniques and Smart Image Manipulation
Maneesh Agrawala

Topics
• Texture Synthesis
• High Dynamic Range Imaging
• Bilateral Filter
• Gradient-Domain Techniques
• Matting
• Graph-Cut Optimization
• Least-Squares Optimization
• Color …

Texture Synthesis
Slides from: Alexei Efros (CMU, Fall 2008) and Ganesh Ramanarayanan (Cornell)

Weather Forecasting for Dummies™
Let's predict the weather:
• Given only today's weather, we want to know tomorrow's
• Suppose the weather can only be {Sunny, Cloudy, Raining}
The "Weather Channel" algorithm:
• Over a long period of time, record:
  – How often S is followed by R
  – How often S is followed by S
  – Etc.
• Compute the percentages for each state: P(R|S), P(S|S), etc.
• Predict the state with the highest probability!
• It's a Markov chain

Markov Chain
• What if we know both today's and yesterday's weather?

Text Synthesis
[Shannon, '48] proposed a way to generate English-looking text using N-grams:
• Assume a generalized Markov model
• Use a large text to compute the probability distribution of each letter given the N-1 previous letters
• Starting from a seed, repeatedly sample this Markov chain to generate new letters
• Also works for whole words
  WE NEED TO EAT CAKE

Mark V. Shaney (Bell Labs)
Results (using the alt.singles corpus):
• "As I've commented before, really relating to someone involves standing next to impossible."
• "One morning I shot an elephant in my arms and kissed him."
• "I spent an interesting evening recently with a grain of salt"
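To make the N-gram idea concrete, here is a minimal character-level sketch in Python. The function name, the tiny example corpus, and the default order and length are illustrative choices, not anything prescribed by the slides.

```python
import random
from collections import defaultdict

def synthesize_text(corpus, n=4, length=200, seed=0):
    """Character-level N-gram synthesis in the spirit of [Shannon, '48]:
    estimate P(next letter | previous n-1 letters) from a corpus, then
    repeatedly sample that Markov chain starting from a seed context."""
    random.seed(seed)
    context_len = n - 1
    counts = defaultdict(list)
    for i in range(len(corpus) - context_len):
        context = corpus[i:i + context_len]
        counts[context].append(corpus[i + context_len])
    out = corpus[:context_len]                  # seed with the start of the corpus
    for _ in range(length):
        candidates = counts.get(out[-context_len:])
        if not candidates:                      # unseen context: stop early
            break
        out += random.choice(candidates)        # duplicates give proportional probabilities
    return out

# Example run on a toy corpus (real runs use a large text):
print(synthesize_text("we need to eat cake. we need to bake a cake. " * 3))
```

Using whole words instead of characters only changes the unit the chain is built over; the sampling loop stays the same.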
Video Textures
Arno Schödl, Richard Szeliski, David Salesin, Irfan Essa
Microsoft Research / Georgia Tech
• Still photos → video clips → video textures

Problem statement
• video clip → video texture: synthesize an endless stream of frames from a short input clip

Our approach
• How do we find good transitions?

Finding good transitions
• Compute the L2 distance D(i, j) between all pairs of frames
• Similar frames make good transitions (frame i vs. frame j)

Markov chain representation
• Each frame is a state; similar frames make good transitions

Transition costs
• Transition from i to j if the successor of i is similar to j
• Cost function: Cij = D(i+1, j)

Transition probabilities
• The probability of a transition Pij is inversely related to its cost:
  Pij ∝ exp(−Cij / σ²)
• A high σ gives more random transitions; a low σ keeps only the very best ones

Preserving dynamics
• Matching a single frame is not enough to preserve motion; compare a small window of frames around the transition
• Cost for a transition i → j:  Cij = Σk wk · D(i+k+1, j+k), summed over a few neighboring offsets k with weights wk

Dead ends
• There may be no good transition at the end of the sequence

Future cost
• Propagate future transition costs backward
• Iteratively compute the new cost:  Fij = Cij + α · mink Fjk
• Related to Q-learning
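Putting these pieces together, here is a minimal random-playback sketch. It uses the simple per-frame cost Cij = D(i+1, j) rather than the windowed version, restricts transitions to frames that still have successors as a crude dead-end guard, and the parameter defaults (α, the σ heuristic, the iteration count) are illustrative, not values from the slides.

```python
import numpy as np

def frame_distances(frames):
    """L2 distance D[i, j] between every pair of frames; frames has shape (n, h, w[, c])."""
    flat = frames.reshape(frames.shape[0], -1).astype(np.float64)
    sq = (flat ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * flat @ flat.T
    return np.sqrt(np.maximum(d2, 0.0))

def video_texture_playback(D, n_steps=200, alpha=0.99, iters=100, sigma=None, seed=0):
    """Random playback of a video texture from a frame-distance matrix D.

    Follows the recipe above: cost C[i, j] = D[i+1, j], future cost
    F[i, j] = C[i, j] + alpha * min_k F[j, k] (fixed-point iteration),
    transition probabilities P[i, j] ~ exp(-F[i, j] / sigma**2).
    """
    n = D.shape[0]
    # Sources and targets are frames 0..n-2, so every frame we land on
    # still has outgoing transitions (avoids the dead end at the last frame).
    C = D[1:n, 0:n - 1]                            # C[i, j] = D[i+1, j]
    F = C.copy()
    for _ in range(iters):                         # propagate future costs backward
        F = C + alpha * F.min(axis=1)[None, :]     # min_k F[j, k], broadcast over i
    if sigma is None:
        sigma = np.sqrt(0.1 * C.mean() + 1e-8)     # heuristic scale, not from the slides
    # Subtracting each row's minimum only stabilizes the exponent; the
    # normalized probabilities are unchanged.
    P = np.exp(-(F - F.min(axis=1, keepdims=True)) / sigma ** 2)
    P /= P.sum(axis=1, keepdims=True)
    rng = np.random.default_rng(seed)
    sequence, i = [], 0
    for _ in range(n_steps):
        sequence.append(i)
        i = rng.choice(n - 1, p=P[i])              # j = i + 1 has cost 0, so it is the likeliest move
    return sequence
```

With these probabilities the natural successor j = i + 1 is always the most likely next frame, and jumps happen preferentially where two parts of the clip look alike.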
Finding good loops
• An alternative to random transitions
• Precompute a set of loops up front

Video portrait
• Useful for web pages

Region-based analysis
• Divide the video up into regions
• Generate a video texture for each region
• Region analysis can also be automatic

User-controlled video textures
• The user selects a target frame range
• Playback can range from slow to variable to fast

Video-based animation
• Like sprites in computer games
• Extract sprites from real video
• Interactively control the desired motion
  (sprite image © 1985 Nintendo of America Inc.)

Video sprite extraction

Video sprite control
• Augmented transition cost
• Needs a future-cost computation
• Precompute future costs for a few angles
• Switch between the precomputed angles according to user input
• [GIT-GVU-00-11]
• Example: interactive fish

Summary
• Video clips → video textures
• Define a Markov process
• Preserve dynamics
• Avoid dead ends
• Disguise visual discontinuities

Discussion
• Some things are relatively easy
• Some are hard
• "Amateur" by Lasse Gjertsen: http://www.youtube.com/watch?v=JzqumbhfxRo
• Michel Gondry train video: http://youtube.com/watch?v=qUEs1BwVXGA

Texture
• Texture depicts spatially repeating patterns
• Many natural phenomena are textures (radishes, rocks, yogurt)

Texture Synthesis
• Goal of texture synthesis: create new samples of a given texture
• Many applications: virtual environments, hole-filling, texturing surfaces

The Challenge
• Need to model the whole spectrum: from repeated to stochastic texture, and textures that are both

Heeger & Bergen 1995
• Seminal paper that introduced texture synthesis to the graphics community
• Algorithm (the histogram-matching step is sketched in code below):
  – Initialize J to noise
  – Create multiresolution pyramids for I and J
  – Match the histograms of J's pyramid levels with I's pyramid levels
  – Loop until convergence
• Can be generalized to 3D

Heeger & Bergen 1995 - Algorithm
• Image pyramids
  – Gaussian
  – Laplacian
• Steerable pyramids [Simoncelli & Freeman 1995]
  – (b): multiple scales of oriented filters
  – (c): a sample image
  – (d): results of the filters in (b) applied to (c)

Heeger & Bergen 1995 - Results
• I → J: successes and failures

Heeger & Bergen 1995 - Verdict
• Texture model: histograms of responses to various filters
• Avoiding copying: inherent in the algorithm
• No user intervention required
• Captures stochastic textures well
• Does not capture structure (lack of inter-scale constraints)

De Bonet 1997
• Propagate constraints downward by matching statistics all the way up the pyramid
• Feature vector: a multiscale collection of filter responses for a given pixel
• Algorithm:
  – Initialize J to an empty image
  – Create multiresolution pyramids for I and J
  – For each pixel in a level of J, randomly choose a pixel from the corresponding level of I that has a similar feature vector

De Bonet 1997 - Algorithm
• 6 feature vectors shown
• Notice how they share …
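The histogram matching at the heart of the Heeger & Bergen algorithm above can be written in a few lines. This is a minimal grayscale sketch; the function name and interface are mine, and the pyramid construction the slides describe is omitted.

```python
import numpy as np

def match_histogram(source, reference):
    """Remap `source` pixel values so their histogram matches `reference`.

    This is the core operation a Heeger & Bergen-style loop applies at every
    pyramid level; here it is shown for a single grayscale image.
    """
    src = source.ravel()
    ref = reference.ravel()
    # Rank each source pixel, then assign it the reference value of equal rank.
    src_order = np.argsort(src)
    ref_sorted = np.sort(ref)
    ranks = np.linspace(0, len(ref_sorted) - 1, num=len(src)).astype(int)
    matched = np.empty_like(src, dtype=ref_sorted.dtype)
    matched[src_order] = ref_sorted[ranks]
    return matched.reshape(source.shape)
```

Following the algorithm outline above, a full synthesis loop would build pyramids for the noise image J and the example texture I, apply this matching to each corresponding pair of pyramid levels, collapse J's pyramid, and repeat until convergence.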

