13.42 Reading 5: Random Variables and Random Processes
Spring 2004
© A.H. Techet & M.S. Triantafyllou

1. Gaussian Distribution

Distributions of random variables are often Gaussian in shape, or can be approximated as such. The Gaussian density is described by the probability density function

(1.1)  \wp(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-(x-\bar{x})^2/2\sigma^2}

which is symmetric about \bar{x}. Given this pdf, the cumulative probability of x is

(1.2)  P(x) = \frac{1}{2} + \mathrm{erf}\left(\frac{x-\bar{x}}{\sigma}\right)

where erf is the error function:

(1.3)  \mathrm{erf}(\zeta) = \frac{1}{\sqrt{2\pi}} \int_0^{\zeta} e^{-y^2/2}\, dy

For an approximately normal random variable (with Gaussian distribution):

68% of events fall within 1σ
95% of events fall within 2σ
99.7% of events fall within 3σ

2. Poisson Distribution

Discrete events occur randomly in time with the following probabilities as δt → 0:

(2.1)
λδt       to have 1 occurrence in the time interval δt
1 − λδt   to have 0 occurrences in time δt
0         to have more than one occurrence in time δt

Thus the probability of k occurrences within a finite time t can be shown to be

(2.2)  P(k \text{ in } t) = e^{-\lambda t}\, \frac{(\lambda t)^k}{k!}

This can be useful when designing platforms that must experience fewer than k occurrences of an event in a certain time t (e.g. water on deck fewer than ten times in one day).
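As a numerical illustration of (2.2), a minimal sketch (not part of the original notes; the helper poisson_prob, the rate λ = 0.25 per hour, and the 24-hour window are made-up example values), summing the Poisson probabilities checks a "fewer than k occurrences" design criterion:

    import math

    def poisson_prob(k: int, lam: float, t: float) -> float:
        """Eq. (2.2): probability of exactly k occurrences in time t."""
        return math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)

    # Hypothetical design check (illustrative numbers only): events at an
    # average rate lam = 0.25 per hour; probability of fewer than 10
    # occurrences in a t = 24-hour day.
    lam, t, k_max = 0.25, 24.0, 10
    p_fewer = sum(poisson_prob(k, lam, t) for k in range(k_max))
    print(f"P(fewer than {k_max} occurrences in {t} h) = {p_fewer:.4f}")

Here λt = 6, and the sum over k = 0, ..., 9 comes out near 0.916, so under these assumed numbers the criterion would be met on roughly 92% of days.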
3. Random Processes

A random variable, x(ζ), can be defined from a random event, ζ, by assigning values x_i to each possible outcome, A_i, of the event. Next define a random process, x(ζ, t), a function of both the event and time, by assigning to each outcome of a random event, ζ, a function in time, x_i(t), chosen from a set of functions:

(3.1)
A_1 → p_1 → x_1(t)
A_2 → p_2 → x_2(t)
 ⋮     ⋮      ⋮
A_n → p_n → x_n(t)

This "menu" of functions, x_i(t), is called the ensemble (set) of the random process and may contain infinitely many x_i(t), which can be functions of many independent variables.

EXAMPLE: Roll the dice: the outcome is A_i, where i = 1:6 is the number on the face of the die, and choose some function

(3.2)  x_i(t) = t^i

to be the random process.

3.1. Averages of a Random Process. Since a random process is a function of time, we can find its averages over some period of time, T, or over a series of events. The calculation of the average and variance in time is different from the calculation of the statistics, or expectations, discussed previously.

TIME AVERAGE (Temporal Mean):

(3.3)  M\{x_i(t)\} = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_i(t)\, dt = \bar{x}_t

TIME VARIANCE (Temporal Variance):

(3.4)  V_t\{x_i(t)\} = \lim_{T \to \infty} \frac{1}{T} \int_0^T \left[ x_i(t) - M\{x_i(t)\} \right]^2 dt

TEMPORAL CROSS/AUTO CORRELATION: This gives us the "correlation", or similarity, between the signal and its time-shifted version.

(3.5)  R_{t_i}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T \left[ x_i(t) - M_t\{x_i(t)\} \right] \left[ x_i(t+\tau) - M_t\{x_i(t+\tau)\} \right] dt

• τ is the correlation variable (time shift).
• |R_{t_i}| is between 0 and 1.
• If R_{t_i} is large (i.e. R_{t_i}(τ) → 1) then x_i(t) and x_i(t + τ) are "similar". For example, a sinusoidal function is similar to itself delayed by one or more periods.
• If R_{t_i} is small then x_i(t) and x_i(t + τ) are not similar; for example, white noise would result in R_{t_i}(τ) = 0.

The corresponding statistics, taken across the ensemble at a fixed time t_1, are:

EXPECTED VALUE:

(3.6)  \mu_x(t_1) = E\{x(t_1)\} = \int_{-\infty}^{\infty} x\, f(x, t_1)\, dx

STATISTICAL VARIANCE:

(3.7)  \sigma_x^2(t_1) = E\{ [x(t_1) - \mu_x(t_1)]^2 \} = \int_{-\infty}^{\infty} (x - \mu_x)^2 f(x, t_1)\, dx

AUTO-CORRELATION:

(3.8)  R_{xx}(t_1, t_2) = E\{x(t_1, \zeta)\, x(t_2, \zeta)\} = E\{ [x(t_1, \zeta) - E\{x(t_1, \zeta)\}]\, [x(t_2, \zeta) - E\{x(t_2, \zeta)\}] \}

(the two forms agree when the process has zero mean, as in the examples below).

Example: Roll the dice: k = 1:6. Assign to the event A_k the random process function

(3.9)  x_k(t) = a \cos(k\omega_0 t)

Evaluate the time statistics:

MEAN:         M_t\{x_k(t)\} = \lim_{T \to \infty} \frac{1}{T} \int_0^T a \cos(k\omega_0 t)\, dt = 0
VARIANCE:     V_t\{x_k(t)\} = \lim_{T \to \infty} \frac{1}{T} \int_0^T a^2 \cos^2(k\omega_0 t)\, dt = \frac{a^2}{2}
CORRELATION:  R_t(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T a^2 \cos(k\omega_0 t) \cos(k\omega_0 (t+\tau))\, dt = \frac{a^2}{2} \cos(k\omega_0 \tau)

Looking at the correlation function, we see that if kω_0τ = π/2 then the correlation is zero. For this example it is the same as taking the correlation of a sine with a cosine, since cosine is simply the sine function phase-shifted by π/2, and cosine and sine are not correlated.

Now look at the STATISTICS of the random process at some fixed time t = t_0:

(3.10)  x_k(\zeta, t_0) = a \cos(k\omega_0 t_0) = y_k(\zeta)

where k is the random variable (k = 1, 2, 3, 4, 5, 6) and each event has probability p_k = 1/6.

EXPECTED VALUE:  E\{y(\zeta)\} = \sum p_k x_k = \sum_{k=1}^{6} \frac{1}{6}\, a \cos(k\omega_0 t_0)
VARIANCE:        V\{y(\zeta)\} = \sum_{k=1}^{6} \frac{1}{6}\, a^2 \cos^2(k\omega_0 t_0)
CORRELATION:     R_{yy}(t_0, \tau) = E\{ y_k(t_0, \zeta)\, y_k(t_0 + \tau, \zeta) \}

STATISTICS ≠ TIME AVERAGES: In general the expected value does not match the time-averaged value of a function. The statistics are time dependent, whereas the time averages are time independent.

4. Stationary Random Processes

A stationary random process is a random process, X(ζ, t), whose statistics (expected values) are independent of time. For a stationary random process:

\mu_x(t_1) = E\{x(t_1, \zeta)\} \neq f(t)
V(t) = \sigma_x^2(t_1) = E\{ [x(t_1) - \mu_x(t_1)]^2 \} = \sigma_x^2
R_{xx}(t, \tau) = R_{xx}(\tau) \neq f(t)
V(t) = R(t, 0) = V \neq f(t)

The statistics, or expectations, of a stationary random process are NOT necessarily equal to the time averages. However, a stationary random process whose statistics ARE equal to the time averages is said to be ERGODIC.

Example: Take some random process defined by y(t, ζ):

(4.1)  y(t, \zeta) = a \cos(\omega_0 t + \theta(\zeta))
(4.2)  y_i(t) = a \cos(\omega_0 t + \theta_i)

where θ(ζ) is a random variable which lies within the interval 0 to 2π, with a constant, uniform pdf such that

(4.3)  \wp(\theta) = \begin{cases} 1/2\pi & \text{for } 0 \le \theta \le 2\pi \\ 0 & \text{else} \end{cases}

STATISTICAL AVERAGE: the statistical mean is not a function of time.

(4.4)  E\{y(t_0, \zeta)\} = \int_0^{2\pi} \frac{1}{2\pi}\, a \cos(\omega_0 t_0 + \theta)\, d\theta = 0

STATISTICAL VARIANCE: the variance is also independent of time.

(4.5)  V(t_0) = R(\tau = 0) = \frac{a^2}{2}

STATISTICAL CORRELATION: the correlation does not depend on t_0, only on the time shift τ.

(4.6)  E\{y(t_0, \zeta)\, y(t_0 + \tau, \zeta)\} = R(t_0, \tau) = \int_0^{2\pi} \frac{1}{2\pi}\, a^2 \cos(\omega_0 t_0 + \theta) \cos(\omega_0 [t_0 + \tau] + \theta)\, d\theta = \frac{1}{2} a^2 \cos(\omega_0 \tau)

Since the statistics are independent of time, this is a stationary process!

Let's next look at the temporal averages for this random process:

MEAN (TIME AVERAGE):

(4.7)  M\{y(t, \zeta_i)\} = \lim_{T \to \infty} \frac{1}{T} \int_0^T a \cos(\omega_0 t + \theta_i)\, dt = \lim_{T \to \infty} \frac{1}{T} \frac{a}{\omega_0} \left[ \sin(\omega_0 T + \theta_i) - \sin \theta_i \right] = 0

TIME VARIANCE:

(4.8)  V_t = R_t(0) = \frac{a^2}{2}

CORRELATION:

(4.9)  R_t(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T a^2 \cos(\omega_0 t + \theta_i) \cos(\omega_0 [t + \tau] + \theta_i)\, dt = \frac{1}{2} a^2 \cos(\omega_0 \tau)

STATISTICS = TIME AVERAGES: Therefore the process is ERGODIC!

N.B.: This particular random process will be the building block for simulating water waves.

5. Ergodic Random Processes

Given the random process y(t, ζ), it is simplest to assume that its expected value is zero.
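To make the ergodicity result of §4 concrete, here is a small numerical sketch (not part of the original notes; the amplitude a, frequency ω_0, time t_0, seed, record length, and sample counts are arbitrary illustrative choices). It compares the ensemble statistics of y(t, ζ) = a cos(ω_0 t + θ(ζ)) at a fixed time against the time averages of a single realization:

    import numpy as np

    rng = np.random.default_rng(0)
    a, w0 = 2.0, 1.5                           # illustrative amplitude and frequency

    # Ensemble statistics at fixed t0: average over many outcomes theta ~ U(0, 2*pi)
    t0 = 3.7
    theta = rng.uniform(0.0, 2.0 * np.pi, size=100_000)
    y_ens = a * np.cos(w0 * t0 + theta)
    print("ensemble mean    ", y_ens.mean())   # eq. (4.4): theory 0
    print("ensemble variance", y_ens.var())    # eq. (4.5): theory a**2 / 2

    # Time averages of one realization y_i(t) = a cos(w0 t + theta_i)
    theta_i = rng.uniform(0.0, 2.0 * np.pi)
    t = np.linspace(0.0, 2000.0, 2_000_000)    # long record stands in for T -> infinity
    y_t = a * np.cos(w0 * t + theta_i)
    print("time mean        ", y_t.mean())     # eq. (4.7): theory 0
    print("time variance    ", y_t.var())      # eq. (4.8): theory a**2 / 2

    # Temporal autocorrelation at one shift tau, eq. (4.9)
    tau = 0.8
    shift = int(round(tau / (t[1] - t[0])))
    R_tau = np.mean(y_t[:-shift] * y_t[shift:])
    print("R_t(tau)         ", R_tau, " theory:", 0.5 * a**2 * np.cos(w0 * tau))

With a long enough record the time averages match the ensemble values, which is exactly the STATISTICS = TIME AVERAGES property that makes this process ergodic, and it is this construction that underlies the wave simulations mentioned above.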

