UMBC CMSC 635 - Image Analogies with Patch Based Texture Synthesis


Image Analogies with Patch Based Texture Synthesis

Patrick Gillespie

Abstract

In this paper we introduce a simple new approach to image analogies using patch based texture synthesis. The patch based method takes the place of the pixel based method that formed the foundation of the original image analogy algorithm. The new method is able to generate reasonable output for certain classes of images and is able to do so quickly.

1 Introduction

An analogy helps us understand the relationship between one pair of objects, given that we understand the relationship between a different pair of objects with a similar kind of relationship. If we know that B relates to B' the same way A relates to A', and we understand the relationship between A and A', we can deduce the relationship between B and B'. This simple reasoning process is extremely powerful and is a handy tool for understanding. For notational purposes, analogies can be expressed as follows:

A : A' :: B : B'

An image analogy is an analogy between two pairs of images. Image analogies are advantageous in that they allow us to easily learn and apply image transformations. They have the potential to save the time of manually programming many different image filters, or of programming a new filter each time a different image transformation is devised. Tools and methods for creating image analogies have been created, studied, and have produced quite impressive results [Hertzmann et al. 2001]. However, the existing method for creating image analogies can take minutes to hours to compute in certain circumstances. We therefore explore an alternative way of creating them, using some of the latest work in texture synthesis. Texture synthesis is the process of taking a small sample of a texture and generating more of it. For example, in Figure 1, given the input image A, a good texture synthesis algorithm should be able to generate more of the texture to create an image like B.
Figure 1: Texture Synthesis Example

The above example was created with Graph Cut texture synthesis [Kwatra et al. 2003], which uses patches of texture to create its output image. In this paper we attempt to integrate this work, along with other patch based texture synthesis techniques, to help accelerate the creation of image analogies, and we explain a simple algorithm that one can use to create them.

2 Previous Work

Image analogies were originally developed by Hertzmann et al. [2001]. Their algorithm produces extremely good results in a number of different circumstances and has many applications including, but not limited to, traditional image filters, artistic filters, super-resolution, texture by numbers, texture transfer, and improved texture synthesis. It has also been shown to work well for image colorization [Welsh et al. 2002]. The algorithm takes three input images: an unfiltered source image A, a filtered source image A', and an unfiltered target image B, and produces a filtered target image B'. Building upon the texture synthesis work of Wei and Levoy [2000] and Ashikhmin [2001], it uses a multiscale representation of all the images, refining the output image B' at each successive level. As in the Wei and Levoy [2000] and Ashikhmin [2001] texture synthesis algorithms, the image is constructed pixel by pixel. This attention to detail leads to the algorithm's one main drawback: it takes a considerable amount of time to complete. Execution time varies with the input and the desired goal; the authors note that on a 1 GHz PC their algorithm can take anywhere from a few seconds for texture synthesis to a few hours for artistic renderings. Since the publication of the Image Analogies paper, faster methods of texture synthesis have been developed. These methods usually build their output images with patches of texture, as opposed to building them pixel by pixel at many different resolutions.
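To make the pixel-by-pixel idea concrete, the following is a deliberately simplified, single-scale sketch of the best-match search at the heart of such algorithms. It omits the multiscale pyramid, the coherence search, and the use of already-synthesized B' neighborhoods that the actual Image Analogies algorithm employs; the function name and the grayscale-image assumption are illustrative, not from the paper.

```python
import numpy as np

def analogy_single_scale(A, Ap, B, patch=3):
    """Simplified single-scale image analogy sketch: for each pixel of B,
    find the pixel of A whose neighborhood best matches B's neighborhood,
    then copy the corresponding A' pixel into B'. Grayscale float images;
    'patch' is the (odd) neighborhood width."""
    r = patch // 2
    Apad = np.pad(A, r, mode='edge')
    Bpad = np.pad(B, r, mode='edge')
    Bp = np.zeros_like(B)
    h, w = A.shape
    # Precompute every neighborhood of A as a flat feature vector.
    coords = [(i, j) for i in range(h) for j in range(w)]
    feats = np.stack([Apad[i:i + patch, j:j + patch].ravel()
                      for i, j in coords])
    for bi in range(B.shape[0]):
        for bj in range(B.shape[1]):
            f = Bpad[bi:bi + patch, bj:bj + patch].ravel()
            # Brute-force L2 nearest neighborhood in A.
            k = int(np.argmin(((feats - f) ** 2).sum(axis=1)))
            Bp[bi, bj] = Ap[coords[k]]
    return Bp
```

Because every output pixel triggers a search over all of A's neighborhoods, this brute-force formulation illustrates why the pixel-based approach is slow and why patch-based alternatives are attractive.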
Image Quilting [Efros and Freeman 2001] is one such method. In Image Quilting, the output image is generated block by block in raster scan order. New blocks are placed next to old blocks with an overlap of 1/6 of the block size. A dynamic programming algorithm is then used to determine which pixels from the new block will appear in the overlap region of the output image. This region of pixels forms a seam between the new block and the rest of the picture. If the block is a good match, the seam will barely, if at all, be noticeable. Blocks placed into the texture are all of the same size. A group of candidate blocks from the input texture is selected based on how well their overlap regions match the corresponding region of the output image, and one of these blocks is chosen at random to be the next block in the texture.

Graph Cut Texture Synthesis [Kwatra et al. 2003] is another method of texture synthesis that works with patches of texture. Instead of using a dynamic programming algorithm to determine the best cut between two overlapping images, it uses a min cut [Ford and Fulkerson 1962] algorithm to determine the optimal cut. The overlapping region between two patches, say A and B, is set up as a graph in which each pixel in the overlap is represented by a node. Nodes along the border next to a patch link back to a single node that represents that patch; this node is either the source or the sink in the min cut algorithm. Nodes that are adjacent in the graph have arcs between them that are weighted based on the following equation:

M(s, t, A, B) = ||A(s) - B(s)|| + ||A(t) - B(t)||

Here s and t are adjacent pixel positions, A(s) represents the color of pixel s in patch A, and B(t) represents the color of pixel t in patch B. After the graph is set up, running the min cut algorithm yields the optimal cut between the two patches.
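The dynamic programming seam used by Image Quilting can be sketched as a minimum-error boundary cut through the overlap region. This is an illustrative implementation under the assumption of grayscale blocks and a vertical overlap; the function name is hypothetical, and the real algorithm applies the same idea to horizontal and L-shaped overlaps as well.

```python
import numpy as np

def min_error_boundary_cut(old_block, new_block):
    """Find the minimum-error vertical seam through the overlap of two
    blocks via dynamic programming. Returns, for each row, the column
    where the cut passes (new block's pixels go to the right of it)."""
    # Squared per-pixel error between the two blocks in the overlap.
    err = (old_block - new_block) ** 2
    h, w = err.shape
    cost = err.copy()
    # Accumulate: each cell adds the cheapest of its three upper neighbors.
    for i in range(1, h):
        for j in range(w):
            lo, hi = max(j - 1, 0), min(j + 2, w)
            cost[i, j] += cost[i - 1, lo:hi].min()
    # Backtrack from the cheapest entry in the bottom row.
    seam = [int(np.argmin(cost[-1]))]
    for i in range(h - 2, -1, -1):
        j = seam[-1]
        lo, hi = max(j - 1, 0), min(j + 2, w)
        seam.append(lo + int(np.argmin(cost[i, lo:hi])))
    return seam[::-1]
```

When the two blocks agree in part of the overlap, the seam naturally runs through the low-error pixels, which is why a well-matched block produces a nearly invisible transition.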
Extremely fast min cut algorithms and implementations have been developed [Boykov et al. 1999]. Figure 3 gives an example of the setup for finding the optimal cut. Nodes bordering the A and B patches link back to it and all adjacent
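The arc-weight equation above can be computed directly from the two patches. The following is a minimal sketch under the assumption of RGB patches stored as H x W x 3 float arrays; the function name is illustrative, and in a full implementation this weight would be assigned to every pair of adjacent overlap pixels before running min cut.

```python
import numpy as np

def edge_weight(A, B, s, t):
    """Matching cost M(s, t, A, B) = ||A(s) - B(s)|| + ||A(t) - B(t)||
    for adjacent pixel positions s and t in the overlap of patches A and B.
    A and B are H x W x 3 float arrays; s and t are (row, col) tuples."""
    return (np.linalg.norm(A[s] - B[s]) +
            np.linalg.norm(A[t] - B[t]))
```

Intuitively, the weight is small where the two patches already agree, so the min cut prefers to pass through regions of agreement, hiding the seam.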

