CS421 Lecture 22: Concurrency

Mark Hills
[email protected]
University of Illinois at Urbana-Champaign
August 3, 2006

Based on slides by Mattox Beckman, as updated by Vikram Adve, Gul Agha, and Elsa Gunter

Outline
◮ Concurrency Overview
◮ Language Support
◮ Shared Memory and Message Passing
◮ Synchronization

Concurrency: Terminology
◮ A program where two or more execution contexts may be active (running) at once is concurrent.
◮ A concurrent program where execution is actually occurring in multiple contexts at once is parallel.
◮ A concurrent program with execution contexts running on multiple nodes is distributed.
◮ An execution context in a program is a thread.
Unfortunately, this terminology (especially "thread") varies with different languages and systems.

Options in Concurrency
In a language with concurrency support, we have many design options.
These are the big three, but they lead to others...
◮ Library or Language: Is support for concurrency baked directly into the language, or is it provided by system libraries?
◮ Shared Memory or Message Passing: Do threads communicate by reading and writing shared areas of memory, or by passing messages back and forth?
◮ Synchronization: How do threads make sure that operations occur in the desired order?

Using Libraries
Libraries are still the most common way to support concurrency.
◮ For shared memory systems, pthreads is the most common option
◮ Vendors often provide their own thread packages as well (Microsoft has similar support for Windows, for instance)
◮ For message passing, PVM and MPI are both very popular, with (my opinion) MPI probably more so now

Language Support
Languages with direct support for concurrency have provided many different constructs to represent concurrent operations.
◮ co-begin
◮ parallel loops
◮ launch-at-elaboration
◮ fork/join
◮ implicit receipt
◮ early reply

Co-begin
A co-begin construct allows a standard block to be executed either sequentially or in parallel, based on the chosen keywords. From Algol 68:

    # from Algol 68 #
    par begin
        p(a, b, c),
        begin
            d := q(e, f);
            r(d, g, h)
        end,
        s(i, j)
    end

The comma-separated statements execute in parallel, but the internal begin is sequentialized.

Parallel Loops
Parallel loops are available in some languages to allow each iteration to be executed in parallel.
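A rough Java analogue of such a parallel loop can be sketched with parallel streams — a library feature rather than a language construct. The class name `ParallelLoop` and the use of a set to record iterations are illustrative choices, not anything from the slides:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.IntStream;

public class ParallelLoop {
    // Runs a loop body for i = 5..10, with iterations allowed to execute
    // in parallel; records which iterations ran. No ordering is guaranteed
    // between iterations, only that all of them complete.
    static Set<Integer> run() {
        Set<Integer> seen = ConcurrentHashMap.newKeySet();
        IntStream.rangeClosed(5, 10).parallel().forEach(seen::add);
        return seen;
    }

    public static void main(String[] args) {
        System.out.println("ran iterations: " + run());
    }
}
```

As with the SR-style loop, the iteration range is fixed up front and each iteration is an independent unit of work; the runtime decides how many actually run at once.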
Some optimizing compilers also transform loops automatically into forms that can be executed in parallel.

    # from the SR language
    co (i := 5 to 10) ->
        p(a, b, i)   # six instances of p
    oc

Launch-at-Elaboration
Here, the thread code is declared in syntax similar to that for subroutines. The thread is created to execute the contained code when the declaration is elaborated at runtime.

    procedure P is
        task T is
            ...
        end T;
    begin
        ...
    end P;

Task T needs information in procedure P, so P will wait for T to finish. Many instances of T can run at once – think recursion...

Fork/Join
Instead of threads being worked into the control flow, they can also be explicitly defined and created. In Ada:

    task type T is
        ...
    begin
        ...
    end T;
    ...
    pt : access T := new T;

Fork/Join in Java

    class CClass extends Thread {
        ...
        CClass(...) {
            // constructor
        }
        ...
        public void run() {
            // code that runs in the thread
        }
    }
    ...
    CClass myclass = new CClass(...);
    myclass.start();
    ...
    myclass.join();

Shared Memory
Shared memory systems allow threads to communicate by updating shared components of program memory.
◮ Advantage: easy, familiar programming model
◮ Disadvantage: leads to race conditions and a need for explicit synchronization
◮ Disadvantage: not scalable

Message Passing
Message passing systems allow threads to communicate by sending messages to one another. Messages generally come in two forms:
◮ Synchronous Messages: One thread sends a message to another and waits for the message to be received; a variant (RPC) actually waits for a reply as well
◮ Asynchronous Messages: One thread sends a message to another and then continues computation
Synchronous is probably more common, and maybe (but not always) makes more sense for threads communicating within a process or on the same machine. Asynchronous makes more sense for many (but not all) distributed computations.

Synchronous Message Passing
Many languages use a synchronous message passing model – Concurrent ML, plus by default with most RPC and RMI mechanisms.
◮ Advantage: allows direct synchronization on events between threads
◮ Advantage: provides a sense of "shared state"; the threads know something about where they each are in a computation
◮ Disadvantage: does not handle network problems, unreliable or congested threads well
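The synchronous style described above can be sketched in Java with `java.util.concurrent.SynchronousQueue`, which has no buffer: a `put` blocks until another thread is ready to `take`, so every transfer is a rendezvous. The class name `Rendezvous` and the message contents are illustrative:

```java
import java.util.concurrent.SynchronousQueue;

public class Rendezvous {
    public static String demo() throws InterruptedException {
        // No buffer: put() blocks until another thread is ready to take(),
        // so sender and receiver synchronize on the event of the transfer.
        SynchronousQueue<String> channel = new SynchronousQueue<>();

        Thread sender = new Thread(() -> {
            try {
                channel.put("hello");   // blocks until the receiver arrives
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        sender.start();

        String msg = channel.take();    // receive; this releases the sender
        sender.join();
        return msg;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("received: " + demo());
    }
}
```

After `take` returns, both threads know the hand-off happened — the "shared state" advantage noted above: each side learns where the other is in its computation.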
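The asynchronous style, by contrast, can be sketched with a buffered queue: the send returns immediately and the sender continues computing, with the receiver draining messages later. The class name `AsyncSend` and the integer messages are illustrative choices:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class AsyncSend {
    public static int demo() throws InterruptedException {
        // An unbounded buffered queue: offer() returns immediately, so the
        // sender never waits for the receiver -- the asynchronous style.
        BlockingQueue<Integer> mailbox = new LinkedBlockingQueue<>();

        Thread sender = new Thread(() -> {
            for (int i = 1; i <= 3; i++) {
                mailbox.offer(i);  // never blocks on an unbounded queue
            }
        });
        sender.start();
        sender.join();  // the sender can finish before anything is received

        int sum = 0;
        for (int i = 0; i < 3; i++) {
            sum += mailbox.take();  // the receiver drains messages later
        }
        return sum;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("sum of received messages: " + demo());
    }
}
```

The buffer is what decouples the two sides; this matches the observation above that asynchronous messaging suits many distributed computations, where waiting for a remote receiver may be impractical.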