CS 425: Distributed Systems
Indranil Gupta
Lecture 28 (Part 1/2)
"The Grid"

Sample Grid Applications
• Astronomers: SETI@Home
• Physicists: data from particle colliders
• Meteorologists: weather prediction
• Bio-informaticians
• ...

Example: Rapid Atmospheric Modeling System (RAMS), ColoState U
• Weather prediction is inaccurate
• Hurricane Georges, 17 days in Sept 1998
– "RAMS modeled the mesoscale convective complex that dropped so much rain, in good agreement with recorded data"
– Used 5 km spacing instead of the usual 10 km
– Ran on 256+ processors

The Grid Today
• Some are 40 Gbps links! (the TeraGrid links)
• "A parallel Internet"
• Each location is a cluster

Application Coded by a Meteorologist
• Distributed computing resources in the Grid: Wisconsin, MIT, NCSA
• The application is a workflow of four jobs:
– Output files of Job 0 are the input to Job 2; output files of Job 2 are the input to Job 3
– Jobs 1 and 2 can be concurrent
– Job 2 is computation intensive, so massively parallel; it may take several hours/days
– Files passed between jobs are several GBs
• 4 stages of a job: Init, Stage in, Execute, Stage out, Publish (see the sketch below)
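To make the workflow concrete, here is a minimal Python sketch of the four-job dependency structure and the four-stage job lifecycle described above. The function names, file names, and the use of a thread pool are illustrative assumptions, not part of the lecture; only the dependencies (Job 0 before Job 2, Job 2 before Job 3, Job 1 independent) and the stage order come from the slides.

```python
import concurrent.futures

def run_job(name, inputs):
    # The four stages from the slide (Publish folded into the last step).
    print(f"{name}: init")
    print(f"{name}: stage in {inputs}")     # copy input files to the site
    outputs = [f"{name}.out"]               # placeholder output file names
    print(f"{name}: execute -> {outputs}")  # the actual computation
    print(f"{name}: stage out & publish")   # copy results off-site, publish
    return outputs

with concurrent.futures.ThreadPoolExecutor() as pool:
    out0 = run_job("job0", [])               # Job 0 must finish first
    f1 = pool.submit(run_job, "job1", [])    # Jobs 1 and 2 can be concurrent
    f2 = pool.submit(run_job, "job2", out0)  # Job 2 consumes Job 0's output
    run_job("job3", f2.result())             # Job 3 waits only for Job 2
    f1.result()                              # ensure Job 1 also completed
```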
Scheduling the Workflow Across Sites
• Globus protocol: external allocation & scheduling, plus stage in & stage out of files between the sites (Wisconsin, MIT, NCSA)
• Internal structure of the different sites is transparent to Globus
• Condor protocol, within a site (e.g., Wisconsin): internal allocation & scheduling, monitoring, and distribution & publishing of files
• The two-level split is sketched below

Tiered Architecture (OSI 7 layer-like), top to bottom:
• High energy physics apps
• Globus
• e.g., Condor
• Workstations, LANs
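The split of responsibilities (Globus picks a site and moves files; Condor picks a machine inside that site) can be sketched as two scheduler layers. This is a hypothetical sketch, not the real Globus or Condor APIs; the class names, site lists, and random placement policies are made up. The point it illustrates is that the external layer never sees inside a site.

```python
import random

class CondorPool:
    """Internal allocation & scheduling within one site (hypothetical)."""
    def __init__(self, site, machines):
        self.site = site
        self.machines = machines

    def run(self, job):
        machine = random.choice(self.machines)  # internal placement policy
        print(f"[{self.site}] condor runs {job} on {machine}")

class GlobusScheduler:
    """External allocation & scheduling across sites (hypothetical)."""
    def __init__(self, sites):
        self.sites = sites  # knows only site-level endpoints, not machines

    def submit(self, job):
        site = random.choice(self.sites)  # external placement policy
        print(f"globus: stage in files for {job} -> {site.site}")
        site.run(job)  # delegate; the site's internals stay opaque
        print(f"globus: stage out files for {job}")

sites = [CondorPool("Wisconsin", ["w1", "w2"]),
         CondorPool("MIT", ["m1"]),
         CondorPool("NCSA", ["n1", "n2", "n3"])]
GlobusScheduler(sites).submit("job2")
```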
Trends: Technology
• Doubling periods – storage: 12 mos, bandwidth: 9 mos, and (what law is this?) CPU speed/capacity: 18 mos
• Then and now
– Bandwidth: 1985, mostly 56 Kbps links nationwide; 2003, 155 Mbps links widespread
– Disk capacity: today's PCs have 100 GBs, same as a 1990 supercomputer

Trends: Users
• Then and now
– Biologists: 1990, were running small single-molecule simulations; 2003, want to calculate structures of complex macromolecules and to screen thousands of drug candidates
– Physicists: 2006, CERN's Large Hadron Collider produced about 10^15 B (a petabyte) during the year
• Trends in technology and user requirements: independent or symbiotic?

Globus Alliance
• The alliance involves U. Illinois Chicago, Argonne National Laboratory, USC-ISI, U. Edinburgh, and the Swedish Center for Parallel Computers
• Activities: research, testbeds, software tools, applications
• Globus Toolkit (latest ver – GT4)
• "The Globus Toolkit includes software services and libraries for resource monitoring, discovery, and management, plus security and file management. Its latest version, GT3, is the first full-scale implementation of the new Open Grid Services Architecture (OGSA)." (The quote predates the GT4 release.)

More
• An entire community, with multiple conferences, get-togethers (GGF), and projects
• Grid projects: http://www-fp.mcs.anl.gov/~foster/grid-projects
• Grid users
– Today: the core is the physics community (since the Grid originates from the GriPhyN project)
– Tomorrow: biologists, large-scale computations (nug30 already)?

Prophecies
• In 1965, MIT's Fernando Corbató and the other designers of the Multics operating system envisioned a computer facility operating "like a power company or water company"
• Plug your thin client into the computing Utility and play your favorite compute- and communicate-intensive application
– [Will this be a reality with the Grid?]

Optional Slides

Grid History – 1990's
• CASA network: linked 4 labs in California and New Mexico
– Paul Messina: massively parallel and vector supercomputers for computational chemistry, climate modeling, etc.
• Blanca: linked sites in the Midwest
– Charlie Catlett, NCSA: multimedia digital libraries and remote visualization
• More testbeds in Germany & Europe than in the US
• I-WAY experiment: linked 11 experimental networks
– Tom DeFanti (U. Illinois at Chicago) and Rick Stevens (ANL): for a week in Nov 1995, a national high-speed network infrastructure; 60 application demonstrations, from distributed computing to virtual reality collaboration
• I-Soft: secure sign-on, etc.

P2P vs. Grid: Starting Points
• P2P: "We must address scale & failure"
• Grid: "We need infrastructure"

Definitions
Grid
• "Infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (1998)
• "A system that coordinates resources not subject to centralized control, using open, general-purpose protocols to deliver nontrivial QoS" (2002)
• 497ig: good legal applications without intellectual fodder
P2P
• "Applications that take advantage of resources at the edges of the Internet" (2000)
• "Decentralized, self-organizing distributed systems, in which all or most communication is symmetric" (2002)
• 497ig: clever designs without good, legal applications

Grid versus P2P - Pick your favorite

Applications
Grid
• Often complex & involving various combinations of
– Data manipulation
– Computation
– Tele-instrumentation
• Wide range of computational models, e.g.
– Embarrassingly parallel (see the sketch after this list)
– Tightly coupled
– Workflow
• Consequence: complexity often inherent in the application itself
P2P
• Some
– File sharing
– Number crunching
– Content distribution
– Measurements
• Legal applications?
• Consequence: low complexity
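Of the computational models above, the embarrassingly parallel one (also the model behind P2P-style number crunching such as SETI@Home) is the easiest to illustrate: independent work units with no communication between them. A minimal Python sketch; the work function and the unit count are made up for illustration.

```python
from multiprocessing import Pool

def crunch(unit):
    # Stand-in for one independent work unit (e.g., analyzing one chunk
    # of radio-telescope data); it never talks to other units.
    return sum(i * i for i in range(unit * 1000))

if __name__ == "__main__":
    # Embarrassingly parallel: units run on any worker, in any order.
    with Pool() as pool:
        results = pool.map(crunch, range(32))
    print(f"processed {len(results)} independent work units")
```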

