Modeling Light
15-463: Computational Photography
Alexei Efros, CMU, Fall 2007
© Michal Havlik

What is light?
Electromagnetic radiation (EMR) moving along rays in space
•R(λ) is EMR, measured in units of power (watts)
–λ is wavelength
Useful things:
•Light travels in straight lines
•In vacuum, radiance emitted = radiance arriving
–i.e. there is no transmission loss
(Figure: point of observation. Figures © Stephen E. Palmer, 2002)

What do we see?
3D world → 2D image
(Figure: point of observation)

What do we see?
3D world → 2D image
(Figure: painted backdrop)

On Simulating the Visual Experience
Just feed the eyes the right data
•No one will know the difference!
Philosophy:
•Ancient question: "Does the world really exist?"
Science fiction:
•Many, many, many books on the subject, e.g.
slowglass
•Latest take: The Matrix
Physics:
•Slowglass might be possible?
Computer Science:
•Virtual Reality
To simulate, we need to know: what does a person see?

The Plenoptic Function
Q: What is the set of all things that we can ever see?
A: The Plenoptic Function (Adelson & Bergen)
Let's start with a stationary person and try to parameterize everything that he can see…
(Figure by Leonard McMillan)

Grayscale snapshot
P(θ,φ) is intensity of light
•Seen from a single view point
•At a single time
•Averaged over the wavelengths of the visible spectrum
(can also do P(x,y), but spherical coordinates are nicer)

Color snapshot
P(θ,φ,λ) is intensity of light
•Seen from a single view point
•At a single time
•As a function of wavelength

A movie
P(θ,φ,λ,t) is intensity of light
•Seen from a single view point
•Over time
•As a function of wavelength

Holographic movie
P(θ,φ,λ,t,VX,VY,VZ) is intensity of light
•Seen from ANY viewpoint
•Over time
•As a function of wavelength

The Plenoptic Function
P(θ,φ,λ,t,VX,VY,VZ)
•Can reconstruct every possible view, at every moment, from every position, at every wavelength
•Contains every photograph, every movie, everything that anyone has ever seen! It completely captures our visual reality!
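The parameterization sequence above (P(θ,φ) → P(θ,φ,λ) → P(θ,φ,λ,t) → the full 7D function) can be sketched as slices of one discretized array. This is only a toy illustration: the axis resolutions, variable names, and random data below are made up, not from the slides.

```python
import numpy as np

# Toy discretization of the full plenoptic function
# P(theta, phi, lambda, t, Vx, Vy, Vz): seven sampled axes.
# Resolutions are arbitrary illustration values.
N_THETA, N_PHI, N_LAMBDA, N_T, N_V = 32, 64, 3, 10, 4

rng = np.random.default_rng(0)
P = rng.random((N_THETA, N_PHI, N_LAMBDA, N_T, N_V, N_V, N_V))

# Each simpler function from the slides is a slice (or reduction) of P,
# taken at a fixed time and a fixed viewpoint (Vx, Vy, Vz):
color_snapshot = P[:, :, :, 0, 0, 0, 0]                 # P(theta, phi, lambda)
grayscale_snapshot = color_snapshot.mean(axis=2)        # P(theta, phi): average over wavelength
movie = P[:, :, :, :, 0, 0, 0]                          # P(theta, phi, lambda, t)
holographic_movie = P                                   # full 7D plenoptic function

print(grayscale_snapshot.shape)  # (32, 64)
```

The point of the sketch is just the slicing structure: every quantity the slides name is the full function with some arguments fixed or integrated out.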
Not bad for a function…
P(θ,φ,λ,t,VX,VY,VZ)

Sampling Plenoptic Function (top view)
Just look up -- QuickTime VR

Ray
Let's not worry about time and color:
P(θ,φ,VX,VY,VZ) -- 5D
•3D position
•2D direction
(Slide by Rick Szeliski and Michael Cohen)

How can we use this?
(Figure: lighting, surface, camera -- no change in radiance along a ray through empty space)

Ray Reuse
Infinite line
•Assume light is constant (vacuum)
4D
•2D direction
•2D position
•non-dispersive medium
(Slide by Rick Szeliski and Michael Cohen)

Only need plenoptic surface

Synthesizing novel views
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph / Lightfield
Outside convex space: 4D
(Figure: "stuff" inside the convex space, empty space outside)
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Organization
2D position
2D direction
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Organization
2D position
2D position
2-plane parameterization
(Figure: s,t plane and u,v plane)
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Organization
Hold s,t constant
Let u,v vary
→ An image
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph / Lightfield

Lumigraph - Capture
Idea 1:
•Move camera carefully over s,t plane
•Gantry
–see Lightfield paper
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Capture
Idea 2:
•Move camera anywhere
•Rebinning
–see Lumigraph paper
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Rendering
For each output pixel:
•determine s,t,u,v
•then either
–use closest discrete RGB sample, or
–interpolate nearby values
(Slide by Rick Szeliski and Michael Cohen)

Lumigraph - Rendering
Nearest:
•closest s
•closest u
•draw it
Blend 16 nearest:
•quadrilinear interpolation
(Slide by Rick Szeliski and Michael Cohen)

Stanford multi-camera array
•640 × 480 pixels × 30 fps × 128 cameras
•synchronized timing
•continuous streaming
•flexible arrangement

Light field photography using a handheld plenoptic camera
Ren Ng, Marc Levoy, Mathieu Brédif, Gene Duval, Mark Horowitz and Pat Hanrahan
© 2005 Marc Levoy

Conventional versus light field camera
(Marc Levoy)

Conventional versus light field
camera
(Figure: uv-plane and st-plane)
(Marc Levoy)

Prototype camera
•4000 × 4000 pixels ÷ 292 × 292 lenses = 14 × 14 pixels per lens
•Contax medium format camera, Kodak 16-megapixel sensor
•Adaptive Optics microlens array, 125μ square-sided microlenses
(Marc Levoy)

Digitally stopping-down
•stopping down = summing only the central portion of each microlens
(Marc Levoy)

Digital refocusing
•refocusing = summing windows extracted from several microlenses
(Marc Levoy)

Example of digital refocusing
(Marc Levoy)

Digitally moving the observer
•moving the observer = moving the window we extract from the microlenses
(Marc Levoy)

Example of moving the observer
(Marc Levoy)

Moving backward and forward

3D Lumigraph
One row of s,t plane
•i.e., hold t constant
•thus s,u,v
•a "row of images"
(by David Dewey)

P(x,t)

2D: Image
What is an image?
All rays through a point
•Panorama?
(Slide by Rick Szeliski and Michael Cohen)

Image
Image plane
2D
•position

Spherical Panorama
All light rays through a point form a panorama
Totally captured in a 2D array -- P(θ,φ)
Where is the geometry???
See also: 2003 New Years Eve
http://www.panoramas.dk/fullscreen3/f1.html

Other ways to sample Plenoptic Function
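The spherical-panorama slides above say that all rays through a point are "totally captured in a 2D array -- P(θ,φ)". A minimal sketch of that indexing, assuming a standard equirectangular layout (the function name, image resolution, and angle conventions below are illustrative choices, not from the slides):

```python
import numpy as np

def direction_to_pixel(d, width=2048, height=1024):
    """Map a 3D view direction to (col, row) in an equirectangular
    panorama, i.e. P(theta, phi) stored as a 2D array.
    Convention assumed here: theta = polar angle from +z (0..pi),
    phi = azimuth around z measured from +x (0..2*pi)."""
    x, y, z = d / np.linalg.norm(d)          # normalize the ray direction
    theta = np.arccos(z)                     # polar angle in [0, pi]
    phi = np.arctan2(y, x) % (2 * np.pi)     # azimuth in [0, 2*pi)
    col = int(phi / (2 * np.pi) * width) % width
    row = min(int(theta / np.pi * height), height - 1)
    return col, row

# Looking along +x lands on the left edge, vertical middle of the panorama
print(direction_to_pixel(np.array([1.0, 0.0, 0.0])))  # (0, 512)
```

This also makes the slide's closing question concrete: the array stores one value per direction (θ,φ) only, so scene depth/geometry is nowhere in the representation.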