UT CS 395T - LECTURE NOTES

CS348B Lecture 9, Pat Hanrahan, Spring 2005

Overview

Earlier lecture: statistical sampling and Monte Carlo integration.
Last lecture: signal-processing view of sampling.
Today: variance reduction (importance sampling, stratified sampling), multidimensional sampling patterns, discrepancy and quasi-Monte Carlo.
Later: path tracing for interreflection; density estimation.

Cameras

The camera measurement integrates radiance over exposure time, lens area, and solid angle:

    R = \int_T \int_A \int_\Omega L(x, \omega, t) \, P(x) \, S(t) \cos\theta \, d\omega \, dA \, dt

[Figures: depth of field (Cook, Porter, Carpenter, 1984); motion blur (Mitchell, 1991).]

Variance

[Figure: 1 shadow ray per eye ray vs. 16 shadow rays per eye ray.]

Definition:

    V[Y] = E[(Y - E[Y])^2] = E[Y^2] - E[Y]^2

Variance decreases with sample size:

    V\left[\frac{1}{N}\sum_{i=1}^N Y_i\right] = \frac{1}{N^2}\sum_{i=1}^N V[Y_i] = \frac{1}{N^2} N V[Y] = \frac{V[Y]}{N}

Variance Reduction

Efficiency measure:

    Efficiency = 1 / (Variance \times Cost)

If one technique has twice the variance of another, it takes twice as many samples to achieve the same variance. If one technique has twice the cost of another with the same variance, it takes twice as much time to achieve the same variance. Techniques to increase efficiency: importance sampling and stratified sampling.

Biasing

Previously we used a uniform probability distribution. Any other probability distribution can be used, but the estimator must change:

    X_i \sim p(x),    Y_i = f(X_i) / p(X_i)

Unbiased Estimate

The estimator remains unbiased:

    E[Y_i] = E\left[\frac{f(X_i)}{p(X_i)}\right] = \int \frac{f(x)}{p(x)} p(x) \, dx = \int f(x) \, dx = I

Importance Sampling

Sample according to f, i.e. choose the pdf

    \tilde{p}(x) = f(x) / E[f],    where E[f] = \int f(x) \, dx,

which is properly normalized:

    \int \tilde{p}(x) \, dx = \frac{1}{E[f]} \int f(x) \, dx = 1

Variance of the resulting estimator \tilde{f} = f / \tilde{p}:

    E[\tilde{f}^2] = \int \left(\frac{f(x)}{\tilde{p}(x)}\right)^2 \tilde{p}(x) \, dx = \int \frac{f(x)^2}{\tilde{p}(x)} \, dx = E[f] \int f(x) \, dx = E[f]^2

so

    V[\tilde{f}] = E[\tilde{f}^2] - E[\tilde{f}]^2 = 0

Zero variance! (In practice this is circular: constructing \tilde{p} exactly requires the integral E[f] we are trying to estimate, so real estimators importance-sample from a pdf that only approximates f.)

Example

Estimate I = \int_0^4 x \, dx with a standard error of 0.008 (from Peter Shirley, Realistic Ray Tracing):

    Method       p(x)       Variance     Samples needed
    importance   (6-x)/16   56.8 N^-1    887,500
    importance   1/4        21.3 N^-1    332,812
    importance   (x+2)/16   6.4 N^-1     98,432
    importance   x/8        0            1
    stratified   1/4        21.3 N^-3    70

Examples

[Figures: sampling by projected solid angle vs. by area; 4 eye rays per pixel, 100 shadow rays each.]

Irradiance

    E = \int_{H^2} L_i(\omega_i) \cos\theta_i \, d\omega_i

Generate a cosine-weighted distribution:

    p(\omega) \, d\omega \propto \cos\theta \, d\omega

Cosine Weighted Distribution

Normalize over the hemisphere:

    \int_{H^2} \cos\theta \, d\omega = \int_0^{2\pi} \int_0^{\pi/2} \cos\theta \sin\theta \, d\theta \, d\phi = \pi

so p(\theta, \phi) = \cos\theta \sin\theta / \pi. The marginal in \phi is uniform:

    p(\phi) = 1/(2\pi)    =>    \phi = 2\pi U_1

and the density in \theta is

    p(\theta) = 2 \cos\theta \sin\theta,    P(\theta) = \int_0^\theta 2 \cos\theta' \sin\theta' \, d\theta' = \sin^2\theta

Inverting P(\theta) = U_2 gives

    \theta = \arcsin\sqrt{U_2}

Sampling a Circle

Equi-areal mapping of the unit square to the unit disk:

    \phi = 2\pi U_1,    r = \sqrt{U_2}

Shirley's Mapping

A concentric mapping that better preserves stratification; in the first wedge:

    r = U_1,    \phi = \frac{\pi}{4} \frac{U_2}{U_1}

Stratified Sampling

Stratified sampling is like jittered sampling: allocate samples per region. With the estimator

    F_N = \frac{1}{N} \sum_{i=1}^N F_i

the new variance is

    V[F_N] = \frac{1}{N^2} \sum_{i=1}^N V[F_i]

Thus, if the variance within each region is less than the overall variance, the resulting variance is reduced. For example, for an edge through a pixel the variance falls as N^{-3/2} rather than N^{-1}.

Mitchell 91

[Figure: uniform random vs. spectrally optimized sampling patterns.]

Discrepancy

For N samples in the unit square, consider the box A = [0, x] \times [0, y] with area xy:

    D(x, y) = \left| \frac{n(x, y)}{N} - xy \right|,    n(x, y) = number of samples in A

    D_N = \max_{x, y} D(x, y)

Theorem on Total Variation

Theorem:

    \left| \frac{1}{N} \sum_{i=1}^N f(X_i) - \int_0^1 f(x) \, dx \right| \le V(f) \, D_N

where V(f) is the total variation of f. Proof: integrate by parts. With the signed discrepancy function \Delta(x) = \frac{1}{N} \sum_i \mathbf{1}[X_i \le x] - x, the error equals -\int_0^1 \Delta(x) f'(x) \, dx, and

    \left| \int_0^1 \Delta(x) f'(x) \, dx \right| \le D_N \int_0^1 |f'(x)| \, dx = V(f) \, D_N

Quasi-Monte Carlo Patterns

Radical inverse (digit reverse) of integer i in base b:

    i = \sum_j d_j b^j    =>    \Phi_b(i) = 0.d_0 d_1 d_2 \ldots (base b)

    i    base 2    reversed    \Phi_2(i)
    1    1         .1          1/2
    2    10        .01         1/4
    3    11        .11         3/4
    4    100       .001        1/8
    5    101       .101        5/8

Hammersley points:

    x_i = ( i/N, \Phi_2(i), \Phi_3(i), \Phi_5(i), \ldots ),    D_N = O\!\left(\frac{(\log N)^{d-1}}{N}\right)

Halton points (sequential; N need not be known in advance):

    x_i = ( \Phi_2(i), \Phi_3(i), \Phi_5(i), \ldots ),    D_N = O\!\left(\frac{(\log N)^d}{N}\right)

Hammersley Points

[Figure: Hammersley point sets ( i/N, \Phi_2(i), \Phi_3(i), \Phi_5(i), \ldots ).]

Edge Discrepancy

Discrepancy can also be measured against edges, i.e. half-planes ax + by + c \ge 0, rather than axis-aligned boxes. Note: the SGI IR multisampling extension uses an 8x8 subpixel grid with 1, 2, 4, or 8 samples.

Low-Discrepancy Patterns

Discrepancy of random edges, from Mitchell (1992):

    Process        16 points    256 points    1600 points
    Zaremba        0.0504       0.00478       0.00111
    Jittered       0.0538       0.00595       0.00146
    Poisson-Disk   0.0613       0.00767       0.00241
    N-Rooks        0.0637       0.0123        0.00488
    Random         0.0924       0.0224        0.00866

Random sampling converges as N^{-1/2}. Zaremba converges faster and has lower discrepancy, but a relatively poor blue-noise spectrum; jittered and Poisson-disk sampling are recommended.

High-Dimensional Sampling

Numerical quadrature with n samples per dimension (N = n^d total): for a given error,

    E \sim \frac{1}{n} = \frac{1}{N^{1/d}}

Random sampling: for a given variance,

    E \sim V^{1/2} \sim \frac{1}{N^{1/2}}

Monte Carlo therefore requires fewer samples for the same error in high-dimensional spaces (d > 2).

Block Design

Latin square: an alphabet of size n arranged in an n x n array so that each symbol appears exactly once in each row and each column; rows and columns are both stratified.

N-Rook pattern: incomplete block
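The importance-sampling estimator Y_i = f(X_i)/p(X_i) and the Shirley table above can be sketched in code. This is a minimal illustration, not from the lecture; the function names are mine. It estimates I = \int_0^4 x dx = 8 two ways: with the uniform pdf p(x) = 1/4 (variance 21.3/N, matching the table) and with p(x) = x/8, which is proportional to f and therefore gives a zero-variance estimator.

```python
import random

def estimate_uniform(f, n, rng):
    # X ~ Uniform(0, 4), p(x) = 1/4, so each sample contributes f(x)/(1/4)
    total = 0.0
    for _ in range(n):
        x = rng.uniform(0.0, 4.0)
        total += f(x) / 0.25
    return total / n

def estimate_importance(f, n, rng):
    # X ~ p(x) = x/8 on [0, 4]: CDF P(x) = x^2/16, inverted as x = 4*sqrt(u).
    # Use u in (0, 1] to avoid x = 0 and a zero-valued pdf.
    total = 0.0
    for _ in range(n):
        u = 1.0 - rng.random()
        x = 4.0 * u ** 0.5
        total += f(x) / (x / 8.0)
    return total / n

rng = random.Random(0)
f = lambda x: x
print(estimate_uniform(f, 100000, rng))   # close to 8, noisy
print(estimate_importance(f, 10, rng))    # exactly 8.0: every sample yields x/(x/8) = 8
```

Because f(x)/p(x) is the constant 8 when p is proportional to f, a single sample suffices, which is the "Samples needed = 1" row of the table.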

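The cosine-weighted hemisphere recipe derived above (phi = 2*pi*U1, theta = arcsin(sqrt(U2))) can be sketched as follows; the function name is mine, and directions are in local shading coordinates with +z as the surface normal. Note sin(theta) = sqrt(U2) and cos(theta) = sqrt(1 - U2), so no trig inversion is actually needed.

```python
import math
import random

def sample_cosine_hemisphere(u1, u2):
    # phi = 2*pi*u1, sin(theta) = sqrt(u2), cos(theta) = sqrt(1 - u2)
    phi = 2.0 * math.pi * u1
    sin_theta = math.sqrt(u2)
    cos_theta = math.sqrt(1.0 - u2)
    return (sin_theta * math.cos(phi),
            sin_theta * math.sin(phi),
            cos_theta)

rng = random.Random(0)
dirs = [sample_cosine_hemisphere(rng.random(), rng.random())
        for _ in range(100000)]

# Sanity check: under p(omega) = cos(theta)/pi, E[cos(theta)] = 2/3
mean_z = sum(d[2] for d in dirs) / len(dirs)
print(mean_z)  # close to 2/3
```

Every returned direction is unit length and lies in the upper hemisphere, and the empirical mean of the z component converges to 2/3, the expected cosine under the pdf cos(theta)/pi.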

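The radical inverse and the Halton/Hammersley constructions above are a few lines of code. This is a generic sketch (the helper names are mine); the base-2 values reproduce the slide's table exactly, since they are dyadic rationals.

```python
def radical_inverse(i, b):
    # Reflect the base-b digits of i about the radix point:
    # i = d0 + d1*b + d2*b^2 + ...  ->  0.d0 d1 d2 ... (base b)
    inv, f = 0.0, 1.0 / b
    while i > 0:
        i, d = divmod(i, b)
        inv += d * f
        f /= b
    return inv

def halton(i, bases=(2, 3, 5)):
    # Sequential Halton point; dimension = number of (coprime) bases
    return tuple(radical_inverse(i, b) for b in bases)

def hammersley(i, n, bases=(2, 3)):
    # Hammersley point: first coordinate i/N, the rest radical inverses
    return (i / n,) + tuple(radical_inverse(i, b) for b in bases)

# The base-2 table from the slide: .1, .01, .11, .001, .101
print([radical_inverse(i, 2) for i in range(1, 6)])
# -> [0.5, 0.25, 0.75, 0.125, 0.625]
```

Hammersley needs N up front for the i/N coordinate, which is why Halton is the one labeled "sequential" above.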

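Stratified sampling as described above can also be sketched on the same integral I = \int_0^4 x dx = 8: split the domain into N equal strata and jitter one sample inside each, instead of drawing N independent uniform samples. The function names are mine. For a smooth 1-D integrand the stratified variance falls as N^-3 (the table's 21.3 N^-3 row) versus N^-1 for plain uniform sampling.

```python
import random

def uniform_estimate(f, n, rng):
    # n independent samples, X ~ Uniform(0, 4), p(x) = 1/4
    return 4.0 / n * sum(f(rng.uniform(0.0, 4.0)) for _ in range(n))

def stratified_estimate(f, n, rng):
    # One jittered sample in each of n equal-width strata of [0, 4]
    width = 4.0 / n
    return width * sum(f((i + rng.random()) * width) for i in range(n))

rng = random.Random(0)
print(uniform_estimate(lambda x: x, 64, rng))
print(stratified_estimate(lambda x: x, 64, rng))  # typically far closer to 8
```

For f(x) = x the stratified error is bounded deterministically: each stratum contributes at most width^2 / 2 of error, or 0.125 total at n = 64, while the uniform estimator's standard error at n = 64 is about 0.58.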