UW-Madison ME 964 - The GPGPU Creation Story

Pages: 40
School: University of Wisconsin-Madison
Course: ME 964 - High Performance Computing for Engineering Applications

The GPGPU Creation Story
Mikola Lysenko
University of Wisconsin-Madison, Dept. of Mechanical Engineering, Spatial Automation Lab

Overview
- Covers the major historical developments in roughly chronological order
- Time period: 1970 to the present
- Focus on history, key technologies, and market forces
- Really a story about PC games

The Early Graphics Displays
- Vector graphics (1960s), based on the oscilloscope
- Consist of an electron gun and a phosphor display
- Examples: Sketchpad, Asteroids, Pong
- (Image: DutchTronix, 2007)

Problems with Vector Displays
- No color (though you can put a filter over part of the screen)
- Limited number of objects
- Crude images: everything will always look like lines
- (Image: Atari Inc., 1979)

Raster Displays
- Represent an image with a framebuffer, a 2D array of pixels
- Require a RAMDAC (Random Access Memory Digital-to-Analog Converter), which translates the framebuffer into a video signal
- Can be used to display any image, in theory

Early Raster Games
- Sprite-based rendering (late 1980s to early 1990s)
- Blit rectangular subregions quickly to animate objects
- Later games added scrolling backgrounds
- Examples: Duke Nukem, Commander Keen
- (Images: Apogee, 1991; id Software, 1990)

Ray Casting Games
- Enabled by raster displays and faster CPUs
- Render a column of pixels per ray
- Examples: Wolfenstein 3D (1992), Doom (1993)
- (Images: id Software, 1992; id Software, 1993)

Problems with Ray Casting
- 3D is an illusion: enemies and objects are sprites
- The world is limited to 2D geometry (though Doom had height)
- (Image: Apogee, 1994)

Polygons
- Pioneered by SGI (1980s)
- Basic idea: represent a 3D object by a collection of polygons on its boundary
- Made famous in games by Quake (1996)
- Basis for all modern 3D graphics
- (Image: id Software, 1996)

Polygon Rasterization, step 1
- Trace the edges of the polygon into two buffers, LEFT and RIGHT

Polygon Rasterization, step 2
- Traverse the LEFT/RIGHT scan lines in the frame buffer

Polygon Rasterization, step 3
- Fill in the scan lines. That's it!
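The three steps above translate almost directly into code. The following is a minimal CPU-side sketch of the same scan-line fill, not taken from the slides: the example triangle, the buffer names, and the ASCII framebuffer are all invented for illustration.

    // Minimal scan-line polygon fill: trace the edges into LEFT/RIGHT buffers,
    // then fill the span on each scan line.  Illustrative sketch only.
    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Point { float x, y; };

    constexpr int W = 32, H = 16;              // tiny framebuffer

    int main() {
        std::vector<Point> poly = {{4, 2}, {28, 5}, {16, 14}};  // an example triangle
        std::vector<float> LEFT(H, 1e9f);      // leftmost crossing per scan line
        std::vector<float> RIGHT(H, -1e9f);    // rightmost crossing per scan line
        std::vector<char> frame(W * H, '.');

        // Step 1: trace every edge, recording where it crosses each scan line.
        for (std::size_t i = 0; i < poly.size(); ++i) {
            Point a = poly[i], b = poly[(i + 1) % poly.size()];
            if (a.y > b.y) std::swap(a, b);    // walk each edge top to bottom
            for (int y = (int)std::ceil(a.y); y < b.y; ++y) {
                float t = (y - a.y) / (b.y - a.y);          // interpolate x along the edge
                float x = a.x + t * (b.x - a.x);
                LEFT[y] = std::min(LEFT[y], x);
                RIGHT[y] = std::max(RIGHT[y], x);
            }
        }

        // Steps 2 and 3: traverse the scan lines and fill each LEFT..RIGHT span.
        for (int y = 0; y < H; ++y)
            if (LEFT[y] <= RIGHT[y])
                for (int x = (int)std::ceil(LEFT[y]); x < RIGHT[y]; ++x)
                    frame[y * W + x] = '#';

        for (int y = 0; y < H; ++y)            // print the result
            printf("%.*s\n", W, &frame[y * W]);
    }

On real hardware the same fill also interpolates colors, depths, and texture coordinates along each span, which is where per-pixel shading later hooks in.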
Problems
- We need to draw a lot of polygons to get a good approximation of a 3D scene
- Drawing polygons is expensive; Quake got around this with very aggressive visibility culling
- CPU performance was not scaling fast enough

First 3D Hardware
- Originally developed for Hollywood and for industrial and academic applications
- First machines built by SGI (IRIS, late 1980s)
- Brought to the mass market by 3Dfx (1996)
- (Image: id Software, 1998)

How the First GPUs Worked
- Used a PCI slot and hijacked the video signal
- Could not do 2D and 3D simultaneously
- No windowed-mode 3D
- Drew only one triangle at a time
- (Image: 3Dfx Interactive, 1996)

Further Evolution
- Originally PCI was fast enough; the GPU could barely keep up
- But triangle rendering got faster, and more bandwidth was needed to use that speed
- Hence the new AGP slot, then AGP 2X, 4X, 8X, then PCI Express and PCIe x16
- Asymptotically, this is a losing battle

Hardware Caching
- On-board RAM was getting cheaper
- Take some of the texture memory and use it to store vertices
- Then the CPU can draw many polygons with a single command, saving a lot of bandwidth
- BUT this only works for static objects; what about animated characters?

Vertex Programs
- Allow programmers to parameterize animations
- The GPU runs a small user-defined program for each vertex
- Can generate an arbitrary animated shape from a few parameters and some cached memory

How Vertex Programs Work
- Vertices and topology are stored on the GPU
- Each vertex is a collection of user-defined values called attributes (e.g., position, color, texture coordinates)
- May also pass uniform values that are constant for all shaders (e.g., a transform matrix, animation coordinates)
- Data flow: attributes + topology + uniforms -> vertex program -> triangles

Example: Skinning (Skeletal Animation)
- Parameterize animations for a character in terms of a collection of bones
- Vertices form a "skin"
- Attributes: relative position, bone weights, color, texture
- (Image: Martinez, Ontology for Virtual Humans, 2007)

Skinning, continued
- To draw a character, call the vertex program with the bone positions passed as uniforms
- The vertex program interpolates the vertices
- Bandwidth savings: thousands of vertices at roughly 30 bytes each, versus a handful of bones at 16 bytes each

Skinning Results
- More characters, more polygons, faster rendering
- Flexible animation (can do ragdolls)
- A win-win situation
- (Image: Croteam, 2001)

Rendering Quality
- Over time, people began to look for ways to add more detail to scenes beyond simple texturing: alternate lighting models, bump mapping, shadows, environment mapping
- These techniques produced higher-quality images but required special GPU features

Proliferation of Features
- To keep up, hardware vendors kept adding more gizmos, with new updates almost every month
- A programmer's nightmare: impossible to take advantage of all the new features
- Bad for coders, bad for gamers, bad for video card manufacturers

Fragment Shaders
- Make per-pixel color calculations programmable
- Reduce interface complexity: implementing features is no longer the vendor's problem
- Opened up brand-new rendering techniques, far too many to even begin covering in this talk
- Dramatic improvements in image quality

Where does the term "shader" come from?
- Shaders were invented by Pixar for their RenderMan platform (1982)
- Meant to help artists specify lighting and textures
- Used in Toy Story (Pixar, 1995)

Fragment Shader Examples
- Decaudin, "Cartoon-Looking Rendering of 3D Scenes", 1996
- Tatarchuk, "Practical Dynamic Parallax Occlusion Mapping", 2005
- Heidrich and Seidel, "Realistic Hardware-Accelerated Shading and Lighting", 1999
- Gosselin, "Phat Lewt: Drawing a Diamond", 2004
- Lengyel et al., "Real-Time Fur over Arbitrary Surfaces", 2001
- d'Eon, NVIDIA Human Head Demo, 2007

GPU Shaders Today
- Can create beautiful, lifelike imagery
- Requires massive computing power: fast memory and many processors
- (Image: Bethesda Softworks, 2007)

Exponential Growth of GPU Power
- To keep up with demands from gaming, GPUs grew very powerful
- (Chart source: linux-mag.com)

The Birth of GPGPU
- GPUs had become very powerful: parallel execution for processing vertices and fragments, and massive memory bandwidth for texture lookups
- Can they be used for something besides graphics? Yes: pixel shaders plus PBOs are Turing complete

How Does It Work? The Ping-Pong Trick
- Encode the data in texture memory
- Render a full-screen quad into a PBO, swap buffers, and repeat (see the sketch after these notes)
- The GPU "program" is nothing more than a perverted lighting calculation

The Animated Ping-Pong
- BUFFER 1 -> draw quad -> BUFFER 2 -> draw quad -> BUFFER 1 -> ...

Early GPGPU
- Basic physics simulation, linear algebra, radiosity, photon mapping
- Coombe et al., "Radiosity on Graphics Hardware", 2004
- Purcell et al., "Photon Mapping on Programmable Graphics Hardware", 2003
- Harris et al., "Physically-Based Visual Simulation on Graphics Hardware", 2002

Problems with Ping-Ponging
- Code is really obfuscated
- Need to split a single algorithm across
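The ping-pong trick described above is, at heart, a simple data-flow pattern: read from one buffer, write the update into a second buffer, swap, and repeat. The following CPU-only sketch, with names and an averaging "kernel" invented purely for illustration, imitates that pattern with two plain arrays; on the actual GPU the update step would be a fragment shader drawing a full-screen quad into an off-screen buffer.

    // CPU-only sketch of the GPGPU ping-pong pattern.  On real hardware the
    // update would be a fragment shader and src/dst would be render targets;
    // the names and the averaging kernel here are purely illustrative.
    #include <cstdio>
    #include <utility>
    #include <vector>

    constexpr int N = 8;                       // the "texture" is N x N pixels
    using Grid = std::vector<float>;

    // One "pass": read only from src, write only to dst (as a shader must).
    void step(const Grid& src, Grid& dst) {
        for (int y = 0; y < N; ++y)
            for (int x = 0; x < N; ++x) {
                float sum = 0.0f; int count = 0;
                for (int dy = -1; dy <= 1; ++dy)        // average the 3x3 neighborhood
                    for (int dx = -1; dx <= 1; ++dx) {
                        int nx = x + dx, ny = y + dy;
                        if (nx >= 0 && nx < N && ny >= 0 && ny < N) {
                            sum += src[ny * N + nx];
                            ++count;
                        }
                    }
                dst[y * N + x] = sum / count;           // the "lighting calculation"
            }
    }

    int main() {
        Grid a(N * N, 0.0f), b(N * N, 0.0f);
        a[(N / 2) * N + N / 2] = 1.0f;                  // encode the input data

        Grid* src = &a;                                 // BUFFER 1
        Grid* dst = &b;                                 // BUFFER 2
        for (int pass = 0; pass < 4; ++pass) {
            step(*src, *dst);                           // "draw the full-screen quad"
            std::swap(src, dst);                        // swap buffers and repeat
        }

        for (int y = 0; y < N; ++y) {                   // read back the final buffer
            for (int x = 0; x < N; ++x) printf("%5.3f ", (*src)[y * N + x]);
            printf("\n");
        }
    }

Note that a buffer is never read and written in the same pass; that is exactly the restriction fragment shaders impose, and it is the reason the swap is needed at all.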

