6.041/6.431 Probabilistic Systems Analysis
Quiz II Review, Fall 2010

1 Probability Density Functions (PDF)

For a continuous RV X with PDF f_X(x):

P(a \le X \le b) = \int_a^b f_X(x)\,dx
P(X \in A) = \int_A f_X(x)\,dx

Properties:
• Nonnegativity: f_X(x) \ge 0 for all x
• Normalization: \int_{-\infty}^{\infty} f_X(x)\,dx = 1

2 PDF Interpretation

Caution: f_X(x) \ne P(X = x)
• If X is continuous, P(X = x) = 0 for all x!
• f_X(x) can be \ge 1

Interpretation: "probability per unit length" for "small" lengths around x:

P(x \le X \le x + \delta) \approx f_X(x)\,\delta

3 Mean and variance of a continuous RV

E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx

Var(X) = E\left[(X - E[X])^2\right] = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx
       = E[X^2] - (E[X])^2 \quad (\ge 0)

E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx
E[aX + b] = aE[X] + b
Var(aX + b) = a^2 Var(X)

4 Cumulative Distribution Functions

Definition: F_X(x) = P(X \le x), monotonically increasing from 0 (at -\infty) to 1 (at +\infty).

• Continuous RV (CDF is continuous in x):
  F_X(x) = P(X \le x) = \int_{-\infty}^{x} f_X(t)\,dt, \qquad f_X(x) = \frac{dF_X}{dx}(x)
• Discrete RV (CDF is piecewise constant):
  F_X(x) = P(X \le x) = \sum_{k \le x} p_X(k), \qquad p_X(k) = F_X(k) - F_X(k-1)

5 Uniform Random Variable

If X is a uniform random variable over the interval [a, b]:

f_X(x) = \begin{cases} \frac{1}{b-a} & \text{if } a \le x \le b \\ 0 & \text{otherwise} \end{cases}

F_X(x) = \begin{cases} 0 & \text{if } x \le a \\ \frac{x-a}{b-a} & \text{if } a \le x \le b \\ 1 & \text{otherwise } (x > b) \end{cases}

E[X] = \frac{a+b}{2}, \qquad var(X) = \frac{(b-a)^2}{12}

6 Exponential Random Variable

X is an exponential random variable with parameter \lambda:

f_X(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}

F_X(x) = \begin{cases} 1 - e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{otherwise} \end{cases}

E[X] = \frac{1}{\lambda}, \qquad var(X) = \frac{1}{\lambda^2}

Memoryless Property: Given that X > t, X - t is an exponential RV with parameter \lambda.

7 Normal/Gaussian Random Variables

General normal RV: N(\mu, \sigma^2):

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(x-\mu)^2 / 2\sigma^2}

E[X] = \mu, \qquad Var(X) = \sigma^2

Property: If X \sim N(\mu, \sigma^2) and Y = aX + b, then Y \sim N(a\mu + b, a^2\sigma^2).

8 Normal CDF

Standard Normal RV: N(0, 1).
CDF of the standard normal RV Y at y: \Phi(y)
• given in tables for y \ge 0
• for y < 0, use the result: \Phi(y) = 1 - \Phi(-y)

To evaluate the CDF of a general normal, express it as a function of a standard normal:

X \sim N(\mu, \sigma^2) \Leftrightarrow \frac{X - \mu}{\sigma} \sim N(0, 1)

P(X \le x) = P\left(\frac{X-\mu}{\sigma} \le \frac{x-\mu}{\sigma}\right) = \Phi\left(\frac{x-\mu}{\sigma}\right)

9 Joint PDF

Joint PDF of two continuous RVs X and Y: f_{X,Y}(x, y)

P(A) = \iint_A f_{X,Y}(x, y)\,dx\,dy

Marginal PDF: f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy

E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y) f_{X,Y}(x, y)\,dx\,dy

Joint CDF: F_{X,Y}(x, y) = P(X \le x, Y \le y)

10 Independence

By definition,

X, Y \text{ independent} \Leftrightarrow f_{X,Y}(x, y) = f_X(x) f_Y(y) \text{ for all } (x, y)

If X and Y are independent:
• E[XY] = E[X]E[Y]
• g(X) and h(Y) are independent
• E[g(X)h(Y)] = E[g(X)]E[h(Y)]

11 Conditioning on an event

Let X be a continuous RV and A be an event with P(A) > 0:

f_{X|A}(x) = \begin{cases} \frac{f_X(x)}{P(X \in A)} & \text{if } x \in A \\ 0 & \text{otherwise} \end{cases}

P(X \in B \mid X \in A) = \int_B f_{X|A}(x)\,dx
E[X \mid A] = \int_{-\infty}^{\infty} x f_{X|A}(x)\,dx
E[g(X) \mid A] = \int_{-\infty}^{\infty} g(x) f_{X|A}(x)\,dx

If A_1, \ldots, A_n are disjoint events that form a partition of the sample space:

f_X(x) = \sum_{i=1}^n P(A_i) f_{X|A_i}(x) \quad (\approx \text{total probability theorem})
E[X] = \sum_{i=1}^n P(A_i) E[X \mid A_i] \quad (\text{total expectation theorem})
E[g(X)] = \sum_{i=1}^n P(A_i) E[g(X) \mid A_i]

12 Conditioning on a RV

X, Y continuous RVs:

f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}

f_X(x) = \int_{-\infty}^{\infty} f_Y(y) f_{X|Y}(x|y)\,dy \quad (\approx \text{total probability theorem})

Conditional Expectation:

E[X \mid Y = y] = \int_{-\infty}^{\infty} x f_{X|Y}(x|y)\,dx
E[g(X) \mid Y = y] = \int_{-\infty}^{\infty} g(x) f_{X|Y}(x|y)\,dx
E[g(X, Y) \mid Y = y] = \int_{-\infty}^{\infty} g(x, y) f_{X|Y}(x|y)\,dx

Total Expectation Theorem:

E[X] = \int_{-\infty}^{\infty} E[X \mid Y = y] f_Y(y)\,dy
E[g(X)] = \int_{-\infty}^{\infty} E[g(X) \mid Y = y] f_Y(y)\,dy
E[g(X, Y)] = \int_{-\infty}^{\infty} E[g(X, Y) \mid Y = y] f_Y(y)\,dy

13 Continuous Bayes' Rule

X, Y continuous RVs, N a discrete RV, A an event.

f_{X|Y}(x|y) = \frac{f_{Y|X}(y|x) f_X(x)}{f_Y(y)} = \frac{f_{Y|X}(y|x) f_X(x)}{\int_{-\infty}^{\infty} f_{Y|X}(y|t) f_X(t)\,dt}

P(A \mid Y = y) = \frac{P(A) f_{Y|A}(y)}{f_Y(y)} = \frac{P(A) f_{Y|A}(y)}{f_{Y|A}(y) P(A) + f_{Y|A^c}(y) P(A^c)}

P(N = n \mid Y = y) = \frac{p_N(n) f_{Y|N}(y|n)}{f_Y(y)} = \frac{p_N(n) f_{Y|N}(y|n)}{\sum_i p_N(i) f_{Y|N}(y|i)}

14 Derived distributions

Def: PDF of a function of a RV X with known PDF: Y = g(X).
Method:
• Get the CDF: F_Y(y) = P(Y \le y) = P(g(X) \le y) = \int_{\{x \mid g(x) \le y\}} f_X(x)\,dx
• Differentiate: f_Y(y) = \frac{dF_Y}{dy}(y)

Special case: if Y = g(X) = aX + b, then

f_Y(y) = \frac{1}{|a|} f_X\!\left(\frac{y - b}{a}\right)

15 Convolution

W = X + Y, with X, Y independent.
• Discrete case: p_W(w) = \sum_x p_X(x) p_Y(w - x)
• Continuous case: f_W(w) = \int_{-\infty}^{\infty} f_X(x) f_Y(w - x)\,dx

Graphical Method:
• put the PMFs (or PDFs) on top of each other
• flip the PMF (or PDF) of Y
• shift the flipped PMF (or PDF) of Y by w
• cross-multiply and add (or evaluate the integral)

In particular, if X, Y are independent and normal, then W = X + Y is normal.

16 Law of iterated expectations

E[X \mid Y = y] = f(y) is a number.
E[X \mid Y] = f(Y) is a random variable (the expectation is taken with respect to X).
To compute E[X \mid Y], first express E[X \mid Y = y] as a function of y.

Law of iterated expectations:

E[X] = E[E[X \mid Y]] \quad (\text{equality between two real numbers})

17 Law of Total Variance

Var(X \mid Y) is a random variable that is a function of Y (the variance is taken with respect to X).
To compute Var(X \mid Y), first express

Var(X \mid Y = y) = E[(X - E[X \mid Y = y])^2 \mid Y = y]

as a function of y.

Law of total variance:

Var(X) = E[Var(X \mid Y)] + Var(E[X \mid Y]) \quad (\text{equality between two real numbers})

18 Sum of a random number of iid RVs

N a discrete RV, X_i i.i.d. and independent of N. Let Y = X_1 + \ldots + X_N. Then:

E[Y] = E[X]E[N]
Var(Y) = E[N] Var(X) + (E[X])^2 Var(N)

19 Covariance and Correlation

Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]

• By definition, X, Y are uncorrelated \Leftrightarrow Cov(X, Y) = 0.
• If X, Y are independent, then X and Y are uncorrelated (the converse is not true).
• In general, Var(X + Y) = Var(X) + Var(Y) + 2\,Cov(X, Y).
• If X and Y are uncorrelated, Cov(X, Y) = 0 and Var(X + Y) = Var(X) + Var(Y).

Correlation Coefficient (dimensionless):

\rho = \frac{Cov(X, Y)}{\sigma_X \sigma_Y} \in [-1, 1]

\rho = 0 \Leftrightarrow X and Y are uncorrelated.
|\rho| = 1 \Leftrightarrow X - E[X] = c[Y - E[Y]] (linearly related)
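
The exponential facts above (normalization, E[X] = 1/\lambda, var(X) = 1/\lambda^2, and the memoryless property) can be checked numerically. The sketch below uses an arbitrary rate lam = 2.0 and a simple trapezoidal integrator; both are illustrative choices, not part of the review.

```python
import math

# Exponential RV with arbitrary illustrative rate lam = 2.0.
lam = 2.0

def exp_pdf(x):
    # f_X(x) = lam * e^{-lam x} for x >= 0, 0 otherwise
    return lam * math.exp(-lam * x) if x >= 0 else 0.0

def exp_cdf(x):
    # F_X(x) = 1 - e^{-lam x} for x >= 0, 0 otherwise
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

def integrate(g, a, b, n=20000):
    # Trapezoidal rule; adequate for these smooth, fast-decaying integrands.
    h = (b - a) / n
    return h * (0.5 * g(a) + 0.5 * g(b) + sum(g(a + i * h) for i in range(1, n)))

total = integrate(exp_pdf, 0.0, 30.0)                       # normalization, ~ 1
mean = integrate(lambda x: x * exp_pdf(x), 0.0, 30.0)       # ~ 1/lam = 0.5
second = integrate(lambda x: x * x * exp_pdf(x), 0.0, 30.0)
var = second - mean ** 2                                    # ~ 1/lam^2 = 0.25

# Memoryless property through the CDF: P(X > s + t | X > s) = P(X > t)
s, t = 0.7, 1.3
lhs = (1.0 - exp_cdf(s + t)) / (1.0 - exp_cdf(s))
rhs = 1.0 - exp_cdf(t)
```

Truncating the integrals at 30 is safe here because the tail mass beyond that point is on the order of e^{-60}.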
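
The standardization recipe of Section 8 is mechanical enough to code directly. A minimal sketch, using the standard identity \Phi(y) = (1 + erf(y/\sqrt{2}))/2 in place of a printed table (mu = 5, sigma = 2, and the evaluation points are arbitrary):

```python
import math

def phi(y):
    # Standard normal CDF via the error function:
    # Phi(y) = (1 + erf(y / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(y / math.sqrt(2.0)))

def normal_cdf(x, mu, sigma):
    # Standardize a general N(mu, sigma^2): P(X <= x) = Phi((x - mu) / sigma)
    return phi((x - mu) / sigma)

p1 = phi(1.0)                    # the familiar table value, ~ 0.8413
sym = phi(-0.5) + phi(0.5)       # symmetry Phi(-y) = 1 - Phi(y) makes this 1
p2 = normal_cdf(7.0, 5.0, 2.0)   # standardizes to Phi((7-5)/2) = Phi(1.0)
```

The symmetry identity is exactly the rule quoted in the section for handling negative arguments with tables that only list y >= 0.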
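
The linear special case of Section 14 can be illustrated with X uniform on [0, 1] and arbitrary constants a = -2, b = 3 (so Y = aX + b is uniform on [1, 3] with density 1/2):

```python
# Derived-distribution special case: Y = aX + b implies
# f_Y(y) = (1/|a|) f_X((y - b) / a). Constants chosen for illustration only.
a, b = -2.0, 3.0

def f_X(x):
    # X uniform on [0, 1]
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_Y(y):
    return f_X((y - b) / a) / abs(a)

d_inside = f_Y(2.0)    # y = 2 lies in the support [1, 3]: density 1/2
d_outside = f_Y(0.5)   # y = 0.5 lies outside the support: density 0
```

Note the |a| in the denominator: with a < 0 the transformation flips the support, but the density formula handles both signs uniformly.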
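
The discrete convolution formula of Section 15 is a short double loop. A sketch using two fair six-sided dice as the (arbitrary) example distributions:

```python
# p_W(w) = sum_x p_X(x) p_Y(w - x) for W = X + Y, X and Y independent.
p_die = {k: 1.0 / 6.0 for k in range(1, 7)}  # fair die PMF

def convolve(pX, pY):
    pW = {}
    for x, px in pX.items():
        for y, py in pY.items():
            # each (x, y) pair contributes px * py to the outcome x + y
            pW[x + y] = pW.get(x + y, 0.0) + px * py
    return pW

p_sum = convolve(p_die, p_die)
# e.g. p_sum[7] = 6/36 and p_sum[2] = 1/36, the classic two-dice values
```

The double loop is exactly the "flip, shift, cross-multiply and add" graphical method, executed for every shift w at once.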
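
Sections 16-18 connect through conditioning on N. The sketch below verifies the random-sum formulas E[Y] = E[X]E[N] and Var(Y) = E[N]Var(X) + (E[X])^2 Var(N) by exact enumeration, with X_i Bernoulli(p) and N uniform on {1, 2, 3} (both distributions are arbitrary illustrative choices):

```python
# Exact check of the random-sum formulas via iterated expectation and
# total variance: condition on N, then average over its PMF.
p = 0.3                       # Bernoulli parameter for the X_i
pN = {1: 1/3, 2: 1/3, 3: 1/3}  # PMF of N

EX, varX = p, p * (1 - p)
EN = sum(n * q for n, q in pN.items())
EN2 = sum(n * n * q for n, q in pN.items())
varN = EN2 - EN ** 2

# Conditionally on N = n: E[Y | N = n] = n*EX, Var(Y | N = n) = n*varX.
EY = sum(q * n * EX for n, q in pN.items())                 # E[E[Y | N]]
EY2 = sum(q * (n * varX + (n * EX) ** 2) for n, q in pN.items())
varY = EY2 - EY ** 2                                        # total variance

# Closed-form values from Section 18 for comparison:
EY_formula = EN * EX                                        # = 0.6
varY_formula = EN * varX + EX ** 2 * varN                   # = 0.48
```

The EY line is the law of iterated expectations in action, and the varY computation decomposes exactly into E[Var(Y|N)] + Var(E[Y|N]).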
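
The covariance identities of Section 19 can be checked exactly on a small joint PMF (the probabilities below are chosen arbitrarily to give positive correlation):

```python
import math

# Joint PMF on {0,1} x {0,1}; Cov(X, Y) = E[XY] - E[X]E[Y].
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

EX = sum(p * x for (x, y), p in joint.items())
EY = sum(p * y for (x, y), p in joint.items())
EXY = sum(p * x * y for (x, y), p in joint.items())
cov = EXY - EX * EY

varX = sum(p * x * x for (x, y), p in joint.items()) - EX ** 2
varY = sum(p * y * y for (x, y), p in joint.items()) - EY ** 2
rho = cov / math.sqrt(varX * varY)          # correlation coefficient

# Var(X + Y) computed directly from the joint PMF ...
ES = sum(p * (x + y) for (x, y), p in joint.items())
var_sum = sum(p * (x + y) ** 2 for (x, y), p in joint.items()) - ES ** 2
# ... should equal Var(X) + Var(Y) + 2 Cov(X, Y)
var_sum_formula = varX + varY + 2.0 * cov
```

Here cov = 0.15 and rho = 0.6, so X and Y are correlated and the 2 Cov(X, Y) cross term in Var(X + Y) is nonzero.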