UW-Madison CS 779 - CS 779 Lecture Notes


Contents: Last Time; Today; Environment Maps; Capturing Maps; Example Images; Resulting Map; High-Dynamic Range Maps; High Dynamic Range Imaging (Debevec and Malik, SIGGRAPH 1997); Solution: Capture Many Images; Quantities; Input; Solving; Results – Store Mapping; Results – Store Log Plot; Results – Church Input; Results – Church Rendering (Ward's Histogram Method); Image-Based Rendering; e.g. Texture Mapping; Plenoptic Function; IBR Systems; Movie-Map Approaches; Quicktime VR (Chen, 1995); Results – Warping; Results – Stitching; View Interpolation (Chen and Williams, 1993); View Morphing (Seitz and Dyer, 1997); View Morphing; View Morphing Process; Next Time

03/07/05 © 2005 University of Wisconsin

Last Time
•Tone Reproduction
–Photographically motivated methods
–Gradient compression techniques
–Perceptual issues

Today
•High Dynamic Range Environment Maps
•Image-Based Rendering

Environment Maps
•Environment maps are infinitely distant area lights covering the hemisphere
–Maps were just given, with little talk of how they came to be
•Probably the most important rendering technique in film special effects
–Used when virtual imagery must be put into real filmed environments
–Allows the real environment to influence the character's appearance

Capturing Maps
•Bring a highly reflective sphere along to the set
•Take multiple pictures of the ball
–Place the ball in important locations (which ones?)
–Take a few pictures around the ball (how many?)
•Go home, stitch the pictures together, and re-project to get a map

Example Images
[slide shows sample mirror-ball photographs]

Resulting Map
•Need to do a re-projection from image space to environment map coordinates

High-Dynamic Range Maps
•The environment map needs higher dynamic range than the final film print
–Why?
•But cameras are themselves low dynamic range
–High dynamic range cameras are becoming available, but you can do better with a standard camera
•How do you get a high dynamic range image from a standard camera?

High Dynamic Range Imaging (Debevec and Malik, SIGGRAPH 1997)
•Problem: the limited dynamic range of film or CCDs makes it impossible to capture a high dynamic range scene in a single image
•Solution: take multiple images at different exposures
•Problem: how do the pieces get put back together to form a single, composite image?
–Made difficult because the mapping from incoming radiance to pixel values is non-linear and poorly documented
•Solution: this paper
–Very influential for such a simple idea – used in lots of other papers
–Code is available

Solution: Capture Many Images
[slide shows the same scene photographed at a range of exposures]

Quantities
•The output you see – pixel values, from scanned film or a digital camera – is some function of the scene irradiance: Z = f(X)
•X is the product of irradiance and exposure time: X = E·Δt
–Assuming the "principle of reciprocity": doubling the exposure time and halving the irradiance gives the same output, and vice versa
•Aim: recover f to allow inversion from observed values to scene irradiances
–Assumption: f is monotonic (surely true, or it's a useless imaging device)

Input
•A set of images, indexed by j, with known exposure times Δt_j
•Call Z_ij the observed value in image j at pixel i
•Doing some math gives us an equation involving f and E_i:
  Z_ij = f(E_i·Δt_j)
  f⁻¹(Z_ij) = E_i·Δt_j
  ln f⁻¹(Z_ij) = ln E_i + ln Δt_j
  g(Z_ij) = ln E_i + ln Δt_j,  where g = ln f⁻¹
•We want the g and E_i that best represent the given data (the images)

Solving
•Solve a linear least-squares problem with the following objective:
  O = Σ_{i=1..N} Σ_{j=1..P} { w(Z_ij) [ g(Z_ij) − ln E_i − ln Δt_j ] }²  +  λ Σ_{z=Z_min+1}^{Z_max−1} [ w(z) g″(z) ]²
•Terms for the function and its smoothness, plus weighting terms w to give more credence to values in the mid-range of the dynamic range of the imaging system
•Gives results only up to a scale, so set a mid-range pixel to be unit radiance
•Don't use all the values – just about 50 pixels (chosen by hand) and enough images to cover the range

Results – Store Mapping / Store Log Plot / Church Input / Church Rendering (Ward's Histogram Method)
[result slides: recovered response curves for the store scene, and input and tone-mapped renderings of the church scene]

Image-Based Rendering
•Geometry and light interaction may be difficult and expensive to model
–Imagine the complexity of modeling the exact geometry of carpet (as just one example)
•Image-based rendering seeks to replace geometry and surface properties with images
–May or may not know the viewing parameters for the existing images
–Existing images may be photographs or computer-generated renderings

e.g. Texture Mapping
•Use photographs to represent complex reflectance functions
•There are variants that seek to do better than standard texture mapping
•Store viewing-direction-specific information
–What sort of effects can you get?
•Store lighting-specific information
–What sort of effects can you get?

Plenoptic Function
•Returns the radiance:
–passing through a given point, x
–in a given direction, (θ, φ)
–with a given wavelength, λ
–at a given time, t
•Many image-based rendering approaches can be cast as sampling from and reconstructing the plenoptic function
•Note: the function is generally constant along segments of a line (assuming vacuum)

IBR Systems
•Methods differ in many ways:
–The range of new viewpoints allowed
–The density of input images
–The representation for samples (known images)
–The amount of user help required
–The amount of additional information required (such as intrinsic camera parameters)
–The method for gathering the input images

Movie-Map Approaches
•Film views from fixed locations, closely spaced, and store them
–Storage can be an issue
•Allow the user to jump from location to location, and pan
•Appropriate images are retrieved from disk and displayed
•No re-projection – just uses nearest

