Flow-based Video Synthesis and Editing


Kiran S. Bhat    Steven M. Seitz∗    Jessica K. Hodgins    Pradeep K. Khosla
Carnegie Mellon University    ∗University of Washington

Figure 1: Synthesizing new video by manipulating flow lines. Images from left to right: one frame of input video, flow lines marked by the user, flow lines marked on the edited video, and one frame of the edited video.

Abstract

This paper presents a novel algorithm for synthesizing and editing video of natural phenomena that exhibit continuous flow patterns. The algorithm analyzes the motion of textured particles in the input video along user-specified flow lines, and synthesizes seamless video of arbitrary length by enforcing temporal continuity along a second set of user-specified flow lines. The algorithm is simple to implement and use. We used this technique to edit video of waterfalls, rivers, flames, and smoke.

CR Categories: I.3.3 [Computer Graphics]: Picture/Image Generation; I.3.6 [Computer Graphics]: Methodology and Techniques—Interaction techniques; I.4.8 [Image Processing and Computer Vision]: Scene Analysis—Motion

Keywords: Texture and Video Synthesis, Particle Systems, Image and Video Processing

1 Introduction

Real footage of natural phenomena has a complexity and beauty that is rarely matched in synthetic footage, in spite of many recent advances in simulation, rendering, and post-processing. Leveraging real footage for special effects is difficult, however, because a natural scene may not match the director's intentions, and modifying the physical setting may be expensive or impossible. In this paper, we present a technique that allows the user to modify real footage while maintaining its natural appearance and complexity.

The key to our approach is the observation that video of a class of natural phenomena can be approximated by continuous motion of particles along well-defined flow lines. First, we capture the dynamics and texture variation of the particles along user-defined flow lines in the input video.
To generate video of arbitrary length, we synthesize particles such that they complete their full paths along each flow line. Playing back these particles along new flow lines allows us to make interesting edits to the original video (Figure 1). The user defines flow lines on both the input and output video, and we leverage his or her visual intuition to create a wide range of edits. We demonstrate the power of this approach by modifying scenes of waterfalls, a river, flames, and smoke.

2 Related Work

Creating realistic animations of fluid flow is an active area of research in computer graphics. Physically based simulation techniques have been successfully applied to simulate and control fluids (e.g., [Treuille et al. 2003]). However, these techniques are computationally expensive and are usually tailored for a single type of natural phenomenon such as smoke, water, or fire.

Recently, several researchers have attempted to model the textured motion of fluidic phenomena in video and synthesize new (and typically longer) image sequences. Non-parametric models for texture synthesis have been applied to create 3D temporal textures of fluid-like motion (e.g., [Wei and Levoy 2000]). The video textures algorithm creates long videos from short clips by concatenating appropriately chosen subsequences [Schödl et al. 2000]. Video sprites extend video textures to allow for high-level control over moving objects in video [Schödl and Essa 2002]. Wang and Zhu [2002] model the motion of texture particles in video using a second-order Markov chain. Doretto et al. [2003] use auto-regressive filters to model and edit the complex motion of fluids in video. The graph cuts algorithm combines volumes of pixels along minimum-error seams to create new sequences that are longer than the original video [Kwatra et al. 2003].

Our synthesis approach is very simple and produces results comparable to the best of these on sequences with continuous flow.
Additionally, our technique allows an artist to specify edits intuitively by sketching input and desired flow lines on top of an image.

3 Approach

Some natural phenomena such as waterfalls and streams have time-varying appearance but roughly stationary temporal dynamics [Doretto et al. 2003]. For example, the velocity at a single fixed point on a waterfall is roughly constant over time. Consequently, these phenomena can be described in terms of particles moving along fixed flow lines (possibly curved) that particles follow from the point at which they enter the image to the point where they leave or become invisible. For instance, the flow lines in the waterfall in Figure 1 are mostly vertical. Each particle also has an associated texture (a patch of pixels), which changes as the particle moves along the flow line. Our video texture synthesis technique produces seamless, infinite sequences by modelling the motion and texture of particles along user-specified flow lines.

Figure 2: A particle-based representation for video. (a) A particle moving along its flow line. (b) Particle texture moving along the same flow line over time. (c) Texture variation of a real particle from the Niagara sequence (Figure 6). For clarity, we show the particle texture every 6th frame as it moves along the flow line. The particle velocity increases as the particle moves downward, as would be expected due to gravity. (d) A filmstrip (left-right, top-bottom) showing the particle texture for each frame as the particle travels downward along the flow line. The texture of two adjacent cells is similar, which facilitates tracking. However, the texture varies significantly between the beginning and end of the flow line.
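The accelerating motion described in Figure 2(c) can be illustrated with a small sketch. The code below is not from the paper; the function name, the constant per-frame acceleration, and all numeric values are assumptions chosen only to show how a particle's positions along a flow line become unevenly spaced as its speed grows, as gravity would cause on a waterfall.

```python
# Hedged sketch (assumed values, not the authors' implementation):
# positions of a particle along a vertical flow line when its speed
# grows by a constant amount each frame.

def flow_line_positions(d0, v0, accel, n_frames):
    """Return the particle's position along the flow line at each frame."""
    positions, d, v = [], d0, v0
    for _ in range(n_frames):
        positions.append(d)
        d += v          # advance along the flow line by the current speed
        v += accel      # speed increases from frame to frame
    return positions

d = flow_line_positions(d0=0.0, v0=1.0, accel=0.5, n_frames=5)
print(d)  # [0.0, 1.0, 2.5, 4.5, 7.0] -- spacing grows as the particle falls
```

The widening gaps between consecutive positions mirror the filmstrip in Figure 2(d), where the particle covers more of the flow line per frame near the bottom than near the top.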
We first describe the way in which the particles move in video, and then describe how they are rendered using texture patches.

Particle Dynamics: To begin, consider the case of a single flow line in the image, as shown in Figure 2(a). Any particle that begins at the start of the flow line d1 will pass through a series of positions d1, d2, . . . , dn during its trajectory. The particle's velocity along the flow line may be time-varying; thus the positions di need not be evenly spaced. The particle's texture may vary as it moves along the flow line (Figure 2(b,c,d)).

We represent the temporal evolution of particles along this flow line as follows. Define a matrix M(d, t) = (p, f), where p refers to a specific particle, and f specifies the frame in the
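The mapping M(d, t) = (p, f) just defined can be sketched as a small data structure. The example below is a hedged illustration only: the dictionary representation, the function name, and the two hand-made particle trajectories are assumptions, not the paper's implementation. It shows how, for each position d on a flow line and each video frame t, one can look up which particle p occupies that position and which frame f of that particle's own trajectory is showing.

```python
# Hedged sketch (assumed representation): build M(d, t) = (p, f) from
# per-particle trajectories, where each trajectory lists the (frame t,
# position d) pairs that particle passes through.

def build_particle_map(trajectories, n_frames):
    """trajectories: {particle_id: [(t, d), ...]} -> {(d, t): (p, f)}"""
    M = {}
    for p, path in trajectories.items():
        for f, (t, d) in enumerate(path):
            if 0 <= t < n_frames:
                M[(d, t)] = (p, f)
    return M

# Two hypothetical particles moving down a flow line, entering one frame apart.
trajectories = {
    0: [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)],
    1: [(1, 0), (2, 1), (3, 2), (4, 3)],
}
M = build_particle_map(trajectories, n_frames=5)
print(M[(2, 2)])  # (0, 2): particle 0, showing frame 2 of its trajectory
print(M[(1, 2)])  # (1, 1): particle 1, showing frame 1 of its trajectory
```

Inverting the per-particle trajectories into a position-and-time index like this is one plausible way to answer the query the matrix definition implies: given a location on the flow line and a video frame, which particle texture should be rendered there.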

