UT Arlington CSE 3302 - Lecture 20 - Concurrency

CSE 3302
Lecture 20: Concurrency
16 Nov 2010
Nate Nystrom, UTA

Credit: Dan Grossman, University of Washington
http://www.cs.washington.edu/homes/djg/teachingMaterials/grossmanSPAC_forkJoinFramework.html

Changing a major assumption

So far, most or all of your study of computer science has assumed that one thing happens at a time. This is called sequential programming: everything is part of one sequence. Removing this assumption creates major challenges and opportunities:
• Programming: divide work among threads of execution and coordinate (synchronize) among them
• Algorithms: how can parallel activity provide speed-up (more throughput: work done per unit time)?
• Data structures: may need to support concurrent access (multiple threads operating on the data at the same time)

What to do with multiple processors?

The next computer you buy will likely have 4 processors.
• Wait a few years and it will be 8, 16, 32, …
• The chip companies have decided to do this (it is not a "law")
What can you do with them?
• Run multiple totally different programs at the same time
  • We already do that, but with time-slicing
• Do multiple things at once in one program
  • Our focus, and more difficult
  • Requires rethinking everything, from asymptotic complexity to how to implement data-structure operations

Parallelism example

Parallelism: increasing throughput by using additional computational resources (code running simultaneously).

Example in pseudocode (not Java, yet): sum the elements of an array.
• This example is bad style, for reasons we'll see
• If you had 4 processors, you might get roughly 4x speedup

  int sum(int[] arr) {
    res = new int[4];
    len = arr.length;
    FORALL(i=0; i < 4; i++) {  // parallel iterations
      res[i] = help(arr, i*len/4, (i+1)*len/4);
    }
    return res[0] + res[1] + res[2] + res[3];
  }

  int help(int[] arr, int lo, int hi) {
    result = 0;
    for (j=lo; j < hi; j++)
      result += arr[j];
    return result;
  }

Concurrency example

Concurrency: allowing simultaneous or interleaved access to shared resources from multiple clients.

Example in pseudocode (not Java, yet): a chaining hashtable.
• The essential correctness issue is preventing bad interleavings
• The essential performance issue is not preventing good concurrency

  class Hashtable<K,V> {
    …
    Hashtable(Comparator<K> c, Hasher<K> h) { … }
    void insert(K key, V value) {
      int bucket = …;
      prevent other inserts/lookups in table[bucket];
      do the insertion;
      re-enable access to table[bucket];
    }
    V lookup(K key) {
      (like insert, but can allow concurrent lookups to the same bucket)
    }
  }
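To make the bucket-locking pseudocode concrete, here is a minimal Java sketch of one way to realize it: one lock object per bucket, using synchronized. This is an illustration, not the lecture's code, and the names (ChainingHashtable, Node, locks) are hypothetical.

  // Hypothetical sketch: locks[i] guards table[i], so operations on different
  // buckets proceed simultaneously while operations on the same bucket are
  // serialized. For brevity, insert prepends without checking for an existing key.
  class ChainingHashtable<K,V> {
    private static class Node<K,V> {
      final K key; V value; Node<K,V> next;
      Node(K k, V v, Node<K,V> n) { key = k; value = v; next = n; }
    }

    private final Node<K,V>[] table;
    private final Object[] locks;  // one lock object per bucket

    @SuppressWarnings("unchecked")
    ChainingHashtable(int buckets) {
      table = (Node<K,V>[]) new Node[buckets];
      locks = new Object[buckets];
      for (int i = 0; i < buckets; i++) locks[i] = new Object();
    }

    void insert(K key, V value) {
      int bucket = Math.floorMod(key.hashCode(), table.length);
      synchronized (locks[bucket]) {  // prevent other inserts/lookups in this bucket
        table[bucket] = new Node<>(key, value, table[bucket]);
      }                               // access re-enabled when the block exits
    }

    V lookup(K key) {
      int bucket = Math.floorMod(key.hashCode(), table.length);
      // The pseudocode above notes lookups to the same bucket could run
      // concurrently; that takes a reader/writer lock, which this sketch omits.
      synchronized (locks[bucket]) {
        for (Node<K,V> n = table[bucket]; n != null; n = n.next)
          if (n.key.equals(key)) return n.value;
      }
      return null;
    }
  }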
Parallelism vs. concurrency

Note: these terms are not yet standard, but the difference in perspective is essential, and many programmers confuse them.
• Parallelism: use more resources to get a faster answer
• Concurrency: correctly and efficiently allow simultaneous access
There is some connection:
• Many programmers use threads for both
• If parallel computations need access to shared resources, then something needs to manage the concurrency

An analogy

CSE 1320 idea: writing a program is like writing a recipe for a cook, one cook who does one thing at a time!
Parallelism:
• Have lots of potatoes to slice?
• Hire helpers, hand out potatoes and knives
• But not too many chefs, or you spend all your time coordinating
Concurrency:
• Lots of cooks making different things, but only 4 stove burners
• Want to allow simultaneous access to all 4 burners, but not cause spills or incorrect burner settings

Shared memory

The model we will assume is shared memory with explicit threads.
Old story: a running program has
• One call stack (each stack frame holding local variables)
• One program counter (the current statement executing)
• Static fields
• Objects (created by new) in the heap (nothing to do with the heap data structure)
New story:
• A set of threads, each with its own call stack and program counter
  • No access to another thread's local variables
• Threads can (implicitly) share static fields / objects
  • To communicate, write somewhere another thread reads

[Diagram: threads, each with its own unshared call stack and current statement (pc, for "program counter"); local variables are numbers/null or heap references; a single heap holds all objects and static fields.]

Other models

Today we focus on shared memory, but you should know that several other models exist and have their own advantages.
• Message-passing: each thread has its own collection of objects; communication is via explicit messages, and the language has primitives for sending and receiving them. (Cooks working in separate kitchens, with telephones.)
• Dataflow: programmers write programs in terms of a DAG; a node executes after all of its predecessors in the graph. (Cooks wait to be handed the results of previous steps.)
• Data parallelism: primitives for things like "apply this function to every element of an array in parallel"
…

Some Java basics

Many languages/libraries provide primitives for creating threads and synchronizing them. We will show how Java does it.
• Many primitives will be delayed until we study concurrency
• For parallelism, we will advocate not using Java's built-in threads directly, but it's still worth seeing them first
Steps to creating another thread:
• Define a subclass C of java.lang.Thread, overriding run
• Create an object of class C
• Call that object's start method
  • Not run, which would just be a normal method call

Parallelism idea

Example: sum the elements of a (presumably large) array, using 4 threads that each sum 1/4 of the array.
Steps:
• Create 4 thread objects, assigning each its portion of the work
• Call start() on each thread object to actually run it
• Wait for the threads to finish
• Add together their 4 answers for the final result

First attempt at parallelism: wrong!

  class SumThread extends java.lang.Thread {
    int lo;       // fields to know what to do
    int hi;
    int[] arr;
    int ans = 0;  // for communicating the result
    SumThread(int[] a, int l, int h) { lo=l; hi=h; arr=a; }
    public void run() {  // overriding, so must have this signature
      for (int i=lo; i < hi; i++)
        ans += arr[i];
    }
  }

  int sum(int[] arr) {
    int len = arr.length;
    int ans = 0;
    SumThread[] ts = new SumThread[4];
    for (int i=0; i < 4; i++)  // do parallel computations
      ts[i] = new SumThread(arr, i*len/4, (i+1)*len/4);
    for (int i=0; i < 4; i++)  // combine results
      ans += ts[i].ans;        // bug: start() is never called, so no thread ever runs
    return ans;
  }

Second attempt (still wrong)

  class SumThread extends java.lang.Thread {
    int lo, hi;   // fields to know what to do
    int[] arr;
    int ans = 0;  // for communicating the result
    SumThread(int[] a, int l, int h) { … }
    public void
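The preview cuts off above. For orientation only, here is a hedged sketch (mine, not the lecture's own continuation) of what sum looks like once it follows all four steps listed under "Parallelism idea", adding the two missing ingredients, start() and join():

  // Hedged sketch, assuming the same SumThread class as above.
  // start() makes each thread actually run; join() waits for a thread to
  // finish, so reading its ans field afterwards is safe.
  int sum(int[] arr) throws InterruptedException {
    int len = arr.length;
    int ans = 0;
    SumThread[] ts = new SumThread[4];
    for (int i = 0; i < 4; i++) {
      ts[i] = new SumThread(arr, i*len/4, (i+1)*len/4);
      ts[i].start();      // run the thread (not run(), which would be a plain call)
    }
    for (int i = 0; i < 4; i++) {
      ts[i].join();       // wait for thread i to finish
      ans += ts[i].ans;   // now safe to read its result
    }
    return ans;
  }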

