DREXEL ECES 490 - nLectures4&5

Telecommunications Networking I
Lectures 4 & 5: Quantifying the Performance of Communication Systems Carrying Analog Information
Copyright 1998, S.D. Personick. All Rights Reserved.

Signals in Noise

The basic problem: an information-bearing signal is corrupted by additive noise, and from the received signal (signal + noise) we must recover the information.

Analog Signals in Noise: Example

Temperature sensor: s = ca (volts), where a = temperature (°C).
Received signal: r = s + n = ca + n
c = 0.01 volt/°C; E(n) = 0, var(n) = 0.0001 volt²

Analog Signals in Noise: Example (continued)

• a = a number representing the information to be communicated. A priori, a is a Gaussian random variable with variance A and zero mean.
• s = a signal in the form of a voltage proportional to a: s = ca (volts), where c is a known constant.
• r = the received signal with additive noise: r = s + n (volts).
• n = a Gaussian random variable with variance N (volt²).
How can we estimate a, given that we receive r?

Signals in Noise

• What is the criterion for determining whether we have done a good job of estimating a from the received signal r?
• It should depend on the difference between our estimate of a and the true value of a.
• Example: minimize E{(a − â)²}, where â is the estimated value of a given r.
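The sensor numbers above can be checked numerically. This sketch simulates the model r = ca + n with the slide's values of c and var(n), and measures the mean squared error of the naive inversion â = r/c, which ignores the prior statistics of a. The prior variance A = 25 °C² is an assumed illustrative value, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Parameters from the slide's sensor example; A is an assumed value.
c = 0.01          # volts per degree C
A = 25.0          # a-priori variance of the temperature a (assumed)
N = 0.0001        # noise variance (volt^2)

num = 100_000
a = rng.normal(0.0, np.sqrt(A), num)   # information (temperature)
n = rng.normal(0.0, np.sqrt(N), num)   # additive Gaussian noise
r = c * a + n                          # received voltage

# Naive estimate: invert the sensor gain, ignore the noise statistics.
a_naive = r / c
mse_naive = np.mean((a - a_naive) ** 2)
print(mse_naive)   # close to N / c**2
```

With these numbers N/c² = 1 °C², so naive inversion leaves about one degree squared of error; the next slide's conditional-mean estimator does better.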
Signals in Noise

• We will give a proof, on the blackboard, that the estimate â of a that minimizes the average value of (a − â)² is â(r) = E(a | r), the expected value of a given r.
• The above is true for any probability distributions of a and n; for the specific Gaussian case given, â(r) = (r/c)[Ac²/(Ac² + N)].

Signals in Noise

A harder example:
• a = a Gaussian random variable with variance A, representing the information.
• s(t) = a signal of duration T (seconds), where s(t) = a·c(t) and c(t) is a known waveform.
• r(t) = the received signal = s(t) + n(t), where n(t) is a random process with a set of known statistical characteristics.
How do we estimate a, given r(t)?

Signals in Noise

• What is the criterion for evaluating how good an estimate of a we have derived from r(t)?
• How do we describe the noise n(t) in a way that is useful for determining how to estimate a from r(t)?

Signals in Noise

Suppose n(t) = n·c(t) + x(t), where:
• n is a Gaussian random variable of variance N; and, in some loosely defined sense,
• x(t) is a random process that is statistically independent of the random variable n, and … (continued on the next slide)

Signals in Noise

• … x(t) is also "orthogonal" to the known waveform c(t). Then we can construct a hand-waving argument that we can ignore x(t) and concentrate on the noise term n·c(t) as we attempt to estimate the underlying information variable a.
• Making this argument precise requires a deeper understanding of the theory of random processes.
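The Gaussian-case formula â(r) = (r/c)·Ac²/(Ac² + N) can be sanity-checked by Monte Carlo: it shrinks the naive estimate r/c toward the prior mean (zero), and its mean squared error should match the theoretical value AN/(Ac² + N). The parameter values below are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed illustrative parameters (c and N match the sensor example).
c, A, N = 0.01, 25.0, 0.0001
num = 200_000
a = rng.normal(0.0, np.sqrt(A), num)
n = rng.normal(0.0, np.sqrt(N), num)
r = c * a + n

# Conditional-mean (MMSE) estimate from the slides.
shrink = A * c**2 / (A * c**2 + N)
a_mmse = (r / c) * shrink
a_naive = r / c

mse_mmse = np.mean((a - a_mmse) ** 2)
mse_naive = np.mean((a - a_naive) ** 2)
print(mse_mmse < mse_naive)   # shrinking toward the prior mean helps
# Theoretical minimum MSE: A * N / (A * c**2 + N)
```

The shrink factor Ac²/(Ac² + N) is the fraction of the received signal's variance due to the information; when the noise dominates (N large), the estimator leans on the prior and shrinks hard toward zero.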
Signals in Noise

• We will show (using the blackboard) that we can convert this problem into the earlier problem of estimating an information parameter a from a received signal of the form r = ca + n.
• While doing so, we will introduce the concept of a "matched filter."

Gaussian Random Processes

• If we look at ("sample") the random process n(t) at times t1, t2, t3, …, tj, we get a set of random variables n(t1), n(t2), n(t3), …, n(tj).
• If the set of random variables {n(tj)} has a joint probability density that is Gaussian, then n(t) is called a Gaussian random process.

Gaussian Random Processes (cont'd)

• Any linear combination of samples of a Gaussian random process is a Gaussian random variable.
• Extending the above, the integral of the product n(t)c(t) over a time interval T is also a Gaussian random variable, provided n(t) is a Gaussian random process and c(t) is a known function.

Gaussian Random Processes (cont'd)

• Let n(t) be a random process (not necessarily Gaussian).
• Define n = (the integral over T of n(t)c(t)) / W, where W = the integral over T of c(t)c(t).
• Then we can write n(t) as follows: n(t) = n·c(t) + "the rest of n(t)".

Gaussian Random Processes (cont'd)

• If n(t) is a white Gaussian random process, then n is a Gaussian random variable, and "the rest of n(t)" is statistically independent of n; i.e., "the rest of n(t)" contains no information that can help us estimate either n or a.

Gaussian Random Processes (cont'd)

• Furthermore, we can build a correlator that works as follows: it takes the received waveform r(t), multiplies it by the known waveform c(t), integrates over the time interval T, and finally divides by W:
z = (the integral over T of r(t)c(t)) / W
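The correlator z = (1/W)∫ r(t)c(t) dt can be sketched in discrete time: replace the integrals by sums over samples. The waveform c(t), the information value, and the noise level below are all assumed for illustration; the point is that z comes out close to a, with the residual error a single scalar Gaussian noise term.

```python
import numpy as np

rng = np.random.default_rng(2)

# Discrete-time approximation of the correlator on [0, T), T = 1 second.
dt = 0.001
t = np.arange(0.0, 1.0, dt)
c_t = np.sin(2 * np.pi * 5 * t)      # known waveform c(t) (arbitrary choice)
W = np.sum(c_t**2) * dt              # W = integral of c(t)^2 over T

a_true = 1.7                                   # information value (assumed)
noise = rng.normal(0.0, 0.5, t.size)           # white Gaussian noise samples
r_t = a_true * c_t + noise                     # received r(t) = a c(t) + n(t)

z = np.sum(r_t * c_t) * dt / W                 # correlator output
print(z)   # close to a_true: z = a + (scalar Gaussian noise)
```

Dividing by W normalizes the correlator so that the signal term passes through with unit gain, which is what reduces the waveform problem to the scalar problem z = a + n.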
Gaussian Random Processes (cont'd)

• Going back to the definitions and equations above, we find that z = a + n, where a is the original information variable we wish to estimate and n is a Gaussian random variable.
• Thus, by introducing the correlator, we convert the new problem into the old problem of estimating a scalar (a) from another scalar (r), where r = a + n and n is a Gaussian random variable.
• The correlation operation is also known as "matched filtering," because it can be accomplished by passing r(t) through a filter whose impulse response is c(−t).
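The equivalence between correlation and filtering with impulse response c(−t) can be demonstrated directly: convolving r(t) with the time-reversed waveform and sampling the output at the full-overlap instant reproduces the correlation integral. The waveform and signal values below are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

dt = 0.01
t = np.arange(0.0, 1.0, dt)
c_t = np.exp(-3 * t)                  # known waveform (arbitrary illustrative choice)

r_t = 0.8 * c_t + rng.normal(0.0, 0.1, t.size)   # received r(t) = a c(t) + noise

# Direct correlation: integral of r(t) c(t) over T.
z_corr = np.sum(r_t * c_t) * dt

# Matched filter: convolve with the time-reversed waveform c(-t),
# then sample the output where the filter fully overlaps the signal.
h = c_t[::-1]
y = np.convolve(r_t, h) * dt
z_mf = y[t.size - 1]                  # full-overlap sample equals the correlation

print(np.isclose(z_corr, z_mf))      # the two computations agree
```

This is why "matched filter" and "correlator" name the same receiver: the filter's impulse response is matched to (the time reverse of) the transmitted waveform, and sampling its output at the right time performs the correlation.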

