Communication Cost of Distributed Computing
Abbas El Gamal, Stanford University
Brice Colloquium, 2009

Motivation
- Performance of distributed information processing systems is often limited by communication:
  ◮ Digital VLSI
  ◮ Multi-processors
  ◮ Data centers
  ◮ Peer-to-peer networks
  ◮ Sensor networks
  ◮ Networked mobile agents
- The purpose of communication is to make a decision, compute a function, or coordinate an action based on distributed data
- How much communication is needed to perform such a task?

Wireless Video Camera Network
- Today's surveillance systems: analog; costly; human operated
- Future systems: digital; networked; self-configuring; automated detection, e.g., suspicious activity, localization, tracking, ...
- Sending all video data requires large communication bandwidth and energy

Distributed Computing Approach (YSEG 04, EEG 07)
[Figure: each camera performs local processing (background subtraction) and sends scan-lines to a cluster head]

Experimental Setup
[Figures: view of the setup; view from a camera; top view of the room; scan-lines from 16 cameras]

Problem Setup
- Network with m nodes; node j has data xj
- Node j wishes to estimate gj(x1, x2, . . . , xm), j = 1, 2, . . . , m
- Nodes communicate and perform local computing
[Figure: network connecting nodes with data x1, x2, x3, . . . , xj, . . . , xm]
- What is the minimum amount of communication needed?

Communication Cost of Distributed Computing
- The problem has been formulated and studied in several fields:
  ◮ Computer science: communication complexity; gossip
  ◮ Information theory: coding for computing; CEO problem
  ◮ Control: distributed consensus
- Formulations differ in:
  ◮ Data model
  ◮ Type of communication and local-computing protocol
  ◮ Estimation criterion
  ◮ Metric for communication cost

Outline
1. Communication Complexity
2. Information Theory: Lossless Computing
3. Information Theory: Lossy Computing
4. Distributed Consensus
5. Distributed Lossy Averaging
6. Conclusion

Communication Complexity

Communication Complexity (Yao 79)
- Two nodes, two-way communication
- Node j = 1, 2 has a binary k-vector xj
- Both nodes wish to compute g(x1, x2)
- Nodes use a round-robin communication and local-computing protocol
[Figure: two nodes holding x1 and x2, each computing g]
- What is the minimum number of bits of communication needed (the communication complexity, C(g))?

Example (Equality Function)
- g(x1, x2) = 1 if x1 = x2, and 0 otherwise
- Upper bound: C(g) ≤ k + 1 bits (node 1 sends its k bits; node 2 replies with the 1-bit answer; a sketch of this protocol follows below)
- For k = 2: C(g) ≤ 3 bits, with g(x1, x2) given by

    x1\x2   00  01  10  11
    00       1   0   0   0
    01       0   1   0   0
    10       0   0   1   0
    11       0   0   0   1

- Lower bound: every communication protocol partitions {0,1}^k × {0,1}^k into g-monochromatic rectangles, each having a distinct code
- If r(g) is the minimum number of such rectangles, then C(g) ≥ log r(g)
- For k = 2, r(g) = 8 ⇒ C(g) ≥ 3 bits
- In general, C(g) ≥ k + 1 ⇒ C(g) = k + 1 bits
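The slides state the upper bound C(g) ≤ k + 1 without spelling out the protocol. The following is a minimal sketch (not from the slides; function and variable names are illustrative) of the standard protocol that achieves it: node 1 sends its entire k-bit vector, and node 2 replies with the one-bit answer so that both nodes learn g.

```python
# Sketch of the trivial (k + 1)-bit protocol for the equality function
# g(x1, x2) = 1 if x1 == x2, else 0. Names here are illustrative.

def equality_protocol(x1: str, x2: str):
    """x1, x2: k-bit strings, e.g. '01'. Returns (g, bits_sent)."""
    k = len(x1)
    assert len(x2) == k
    # Round 1: node 1 sends all k bits of x1 to node 2.
    message = x1                      # k bits on the channel
    # Round 2: node 2 computes g and sends the 1-bit answer back,
    # so node 1 also learns g(x1, x2).
    g = 1 if message == x2 else 0     # 1 more bit on the channel
    return g, k + 1

print(equality_protocol('01', '01'))  # (1, 3): matches C(g) = 3 for k = 2
print(equality_protocol('01', '10'))  # (0, 3)
```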
Number-on-Forehead Game
- m players; player j has a binary k-vector xj on her forehead
- Player j can see all the vectors except her own, xj
- Every player wants to know whether x1 = x2 = . . . = xm or not
- Players take turns broadcasting to all other players
- Recall that for m = 2, C(g) = k + 1
- What is C(g) for m ≥ 3? Consider the following protocol (sketched in code at the end of this section):
  ◮ Player 1 broadcasts A = 1 if x2 = x3 = . . . = xm, and A = 0 otherwise
  ◮ If A = 0, then all players know that the values are not all equal
  ◮ If A = 1, only player 1 does not know whether all values are equal; player 2 then broadcasts B = 1 if x1 = x3, and B = 0 otherwise
- C(g) = 2 bits suffice, independent of m and k!
- Overlap in information can significantly reduce communication

More on Communication Complexity
- Other definitions of communication complexity:
  ◮ Average: uniform distribution over node values
  ◮ Randomized: nodes use "coin flips" in communication
  ◮ Non-deterministic: non-zero probability of computing error
- Some references:
  ◮ A. Yao, "Some complexity questions related to distributive computing," STOC, 1979
  ◮ A. Orlitsky and A. El Gamal, "Average and randomized communication complexity," IEEE Trans. on Info. Theory, 1990
  ◮ R. Gallager, "Finding parity in a simple broadcast network," IEEE Trans. on Info. Theory, 1988
  ◮ E. Kushilevitz and N. Nisan, Communication Complexity, Cambridge University Press, 2006

Information Theory: Lossless Computing

Information Theory: Lossless Computing
- m-node network; node j observes source Xj
- Xj generates an i.i.d. process {Xj1, Xj2, . . .}, with Xji taking values in a finite set
- Sources may be correlated: (X1, X2, . . . , Xm) ∼ p(x1, x2, . . . , xm)
- Node j wishes to find an estimate ĝji of gj(X1i, X2i, . . . , Xmi) for each sample i = 1, 2, . . .
- Block coding: nodes communicate and perform local computing over blocks of n samples
- Lossless criterion: ĝj^n = gj^n for all j ∈ {1, 2, . . . , m} with high probability (whp)
- What is the minimum sum rate in bits/sample-tuple?
- The problem is in general open, even for three nodes

Lossless Compression
- Two nodes, one-way communication
- Sender node observes an i.i.d. source X ∼ p(x)
- Receiver node wishes to form a lossless estimate X̂^n of X^n
[Figure: sender observes X^n; receiver forms X̂^n]

Theorem (Shannon 1948)
The minimum lossless compression rate is the entropy of X,
H(X) := −E(log p(X)) bits/sample

Example (Binary Source)
X ∼ Bern(p): H(p) := −p log p − (1 − p) log(1 − p) ∈ [0, 1]
(a numerical sketch of H(p) appears at the end of this section)

Random Binning Proof (Cover 1975)
Key facts: If x^n is
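The two-bit number-on-forehead protocol described above is simple enough to state as code. The sketch below is not from the slides and its names are illustrative; it takes the list of all m vectors and simulates the two broadcast bits A and B.

```python
# Sketch of the 2-bit number-on-forehead protocol for m >= 3 players.
# x is the list [x1, ..., xm]; player j can see every entry except x[j-1].

def nof_equality(x):
    """Returns (all_equal, bits_broadcast)."""
    m = len(x)
    assert m >= 3
    # Broadcast 1: player 1 sees x2, ..., xm and sends A = 1 iff they all match.
    A = int(all(xj == x[1] for xj in x[1:]))
    if A == 0:
        # Everyone now knows the values cannot all be equal.
        return False, 2   # at most 2 bits even if B is still sent
    # Broadcast 2: A = 1, so x2 = ... = xm; only player 1 is still unsure.
    # Player 2 sees x1 and x3 and sends B = 1 iff x1 == x3.
    B = int(x[0] == x[2])
    return bool(B), 2

print(nof_equality(['101', '101', '101', '101']))  # (True, 2)
print(nof_equality(['111', '101', '101', '101']))  # (False, 2): x1 differs
```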
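For the binary-source example on the lossless compression slide, the binary entropy H(p) is easy to evaluate numerically. A minimal sketch (the function name is illustrative), using base-2 logarithms so the rate comes out in bits per sample:

```python
# Binary entropy H(p) = -p log2 p - (1 - p) log2 (1 - p), the minimum
# lossless compression rate (bits/sample) for an i.i.d. Bern(p) source.
import math

def binary_entropy(p: float) -> float:
    if p in (0.0, 1.0):
        return 0.0                    # a deterministic source needs no bits
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0    -> a fair coin is incompressible
print(binary_entropy(0.1))   # ~0.469 -> under half a bit per sample suffices
```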