UCF CAP 5937 - SilSketch - Automated Sketch-Based Editing of Surface Meshes

EUROGRAPHICS Workshop on Sketch-Based Interfaces and Modeling (2007)
M. van de Panne, E. Saund (Editors)

SilSketch: Automated Sketch-Based Editing of Surface Meshes
Johannes Zimmermann, Andrew Nealen, Marc Alexa
TU Berlin

Abstract
We introduce an over-sketching interface for feature-preserving surface mesh editing. The user sketches a stroke that is the suggested position of part of a silhouette of the displayed surface. The system then segments all image-space silhouettes of the projected surface, identifies among all silhouette segments the best matching part, derives vertices in the surface mesh corresponding to the silhouette part, selects a sub-region of the mesh to be modified, and feeds appropriately modified vertex positions together with the sub-mesh into a mesh deformation tool. The overall algorithm has been designed to enable interactive modification of the surface, yielding a surface editing system that comes close to the experience of sketching 3D models on paper.

Categories and Subject Descriptors (according to ACM CCS): I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - Modeling packages; I.3.6 [Methodology and Techniques]: Interaction techniques; General Terms: Sketch Based Modeling, Deformations, Laplacian Surface Editing, Differential Geometry, Sketching

1. Introduction

The process of generating 3D shapes in engineering or content creation typically goes through several design reviews: renderings of the shapes are viewed on paper or a screen, and designers indicate necessary changes. Oftentimes designers sketch replacements of feature lines onto the rendering. This information is then taken as the basis of the next cycle of modifications to the shape.

We present a surface mesh editing system motivated by design reviews: given nothing but the over-sketch of a feature line, it automatically deforms the mesh geometry to accommodate the indicated modification. Building on existing mesh deformation tools [SLCO∗04, NSACO05], the main feature of our work is the automatic derivation of all necessary parameters that these systems require as input, in real-time.

In particular, Laplacian Surface Editing [SLCO∗04], but also most other recent mesh deformation techniques (e.g., [YZX∗04, BPG06]), require the selection of: handle vertices, the displacement for these handle vertices, and a region of interest (ROI), representing the part of the mesh to be modified to accommodate the displaced handle vertices. For our system, we need to compute this information from the over-sketched feature line alone, and we do this in fractions of a second. The steps described below comprise our system (see also Fig. 1); breaking the problem down into these steps and performing each step in a few milliseconds are the main contributions of our work:

1. Based on the screen projection of the shape, a subset of pixels lying on potential feature lines is identified. These pixels are then segmented and converted to image-space polylines as the set of candidate feature lines.
2. The user-sketch is matched against all polylines to find the corresponding part on a feature line.
3. Based on the correspondence in image-space, a set of handle vertices in the surface mesh is selected. The image-space projection of these vertices covers the detected part of the feature line.
4. New positions for the handle vertices are derived from the displacements in image-space between the projection of the handle vertices and the user's sketch; these are the necessary displacements.
5. A part of the surface mesh around the handle vertices, computed by region growing, is defined as the ROI.
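To make steps 2 and 4 concrete, the following is a minimal, simplified sketch (Python with NumPy), not the matching procedure used in the paper: it resamples the user stroke, scores every candidate polyline by the mean distance to its nearest points, and returns the matched part together with the image-space displacements. The function names `resample` and `match_stroke`, the sample counts, and the nearest-point scoring are assumptions made for illustration; the actual system additionally culls candidates by proximity to the sketch (Fig. 1c) and may compute correspondences differently.

```python
import numpy as np

def resample(polyline, n):
    """Resample a 2D polyline of shape (k, 2) to n points, evenly spaced by arc length."""
    seg = np.linalg.norm(np.diff(polyline, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])
    s = np.linspace(0.0, t[-1], n)
    x = np.interp(s, t, polyline[:, 0])
    y = np.interp(s, t, polyline[:, 1])
    return np.stack([x, y], axis=1)

def match_stroke(stroke, polylines, n=64):
    """Return (index of best polyline, matched points, image-space displacements).

    stroke:    (m, 2) user sketch in image space.
    polylines: list of (k_i, 2) candidate feature lines in image space.
    """
    s = resample(np.asarray(stroke, dtype=float), n)
    best = None
    for idx, pl in enumerate(polylines):
        p = resample(np.asarray(pl, dtype=float), 4 * n)
        # Distance from every stroke sample to every candidate point.
        d = np.linalg.norm(s[:, None, :] - p[None, :, :], axis=2)   # (n, 4n)
        nearest = d.argmin(axis=1)                                  # closest candidate per sample
        score = d[np.arange(n), nearest].mean()                     # mean matching distance
        if best is None or score < best[0]:
            best = (score, idx, p[nearest])
    score, idx, matched = best
    # Displacements from the matched feature-line part to the user stroke.
    return idx, matched, s - matched
```

The returned image-space offsets would then be transferred to the 3D handle vertices whose projections lie on the matched part (steps 3 and 4), for example by offsetting each vertex parallel to the view plane.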
Note that in steps 3, 4, and 5 we compute the necessary input for shape deformation, while steps 1 and 2 are required to identify the input, based only on the user-sketch.

Figure 1: Algorithm pipeline. Top row, from left to right: a) user-sketch, b) image-space silhouettes, c) retained silhouettes after proximity culling, d) handle estimation; Bottom row, left to right: e) correspondences and ROI estimation by bounding volumes, f) setup for Laplacian Surface Editing, g) and h) deformation result. Note that the user only sees a), g) and h).

2. Related Work and System Design

Sketch-based interfaces are a very popular method for creation and deformation of 3D surface meshes [IMT99, KSvdP07, KS07]. Deriving the parameters for mesh deformation from sketches only is not new: Kho and Garland [KG05] derive ROI and handle vertices from sketching onto the projected shape, essentially implying a skeleton for a cylindrical part. A second stroke then suggests a modification of the skeleton, and the shape is deformed according to the deformed skeleton. However, according to Hoffman and Singh [HS97], we recognize objects mainly by a few feature lines, namely silhouettes and concave creases. Since the process of paper-based sketching relies exactly on these features, we feel it is more natural to use them as the basis for our over-sketching mesh deformation tool. This line of thought is similar to Nealen et al. [NSACO05]. They have enhanced Laplacian Surface Editing techniques to work in the setting of prescribing new silhouettes. In particular, this requires positional constraints defined on mesh edges and finding the correspondence between a pre-selected silhouette of the mesh and the over-sketched silhouette. In their system the user manually selects the ROI and a part of one of the silhouettes as a pre-process. In our system, all these manual selections are now automated; the user only provides a single stroke, from which handle and ROI are estimated (Figs. 1 and 2).

Figure 2: Required user interaction (from left to right): Nealen et al. [NSACO05], Kho and Garland [KG05], and our approach.

We have also observed that computing silhouettes from the mesh representation (i.e., in object-space) has problems: the silhouette path on the mesh might fold onto itself when projected to image-space, i.e., a point of the silhouette in image-space could map to several pieces of the silhouette on the mesh. As a result, the mapping from the sketch to handle vertices could be ill-defined. More generally, the complexity of the silhouette path on the surface is not necessarily reflected in its image-space projection, making a reasonable mapping from the sketch to vertices on the mesh difficult.

Figure 3: Depth map discontinuities, Normal map
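Figure 3 points to how the candidate feature pixels of step 1 can be found in image space: discontinuities in the rendered depth map mark occluding contours, and discontinuities in the normal map mark creases. The sketch below (Python/NumPy) illustrates this idea only; the buffer layout, the thresholds `depth_thresh` and `normal_thresh`, and the function name `feature_pixels` are assumptions, not the paper's implementation.

```python
import numpy as np

def feature_pixels(depth, normals, depth_thresh=0.01, normal_thresh=0.8):
    """Flag candidate feature-line pixels in image space.

    depth:   (H, W) array of per-pixel depth values.
    normals: (H, W, 3) array of unit normals per pixel.
    Returns a boolean (H, W) mask of silhouette/crease candidates.
    """
    # Depth discontinuities: large jumps between neighboring pixels
    # indicate occluding contours (exterior and interior silhouettes).
    dz_x = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    dz_y = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    depth_edges = (dz_x > depth_thresh) | (dz_y > depth_thresh)

    # Normal discontinuities: neighboring normals pointing in clearly
    # different directions indicate creases.
    dot_x = np.sum(normals[:, 1:] * normals[:, :-1], axis=-1)
    dot_y = np.sum(normals[1:, :] * normals[:-1, :], axis=-1)
    normal_edges = np.zeros(depth.shape, dtype=bool)
    normal_edges[:, 1:] |= dot_x < normal_thresh
    normal_edges[1:, :] |= dot_y < normal_thresh

    return depth_edges | normal_edges
```

The resulting binary mask would still have to be thinned, segmented, and linked into the image-space polylines that step 1 feeds to the matching stage.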

