
## Lecture 2: Marginal Distributions, Expectations, and Variance


- Lecture number: 2
- Pages: 2
- Type: Lecture Note
- School: Cornell University
- Course: Econ 3120 - Applied Econometrics
- Edition: 1


**Outline of Last Lecture**

- I. Random Variables
- II. Continuous Distributions
- III. Multivariate Distributions

**Outline of Current Lecture**

- IV. Marginal Distributions
- V. Expectations and Variance
- VI. Covariance

## 1.4 Marginal Distributions

Suppose we have a joint distribution of X and Y given by f(x, y). How do we find the distribution of X or Y alone? The marginal distribution of a discrete random variable X is given by

$$g(x) = \sum_y f(x, y)$$

Similarly, the marginal distribution of Y is given by

$$h(y) = \sum_x f(x, y)$$

In the continuous case, the marginal distributions of X and Y are given by

$$g(x) = \int_{-\infty}^{\infty} f(x, y)\,dy \quad \text{and} \quad h(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$$

## 1.5 Conditional Distributions

The conditional distribution of X given Y = y is defined as

$$f(x \mid y) = \frac{f(x, y)}{h(y)},$$

where h(y) is the value of the marginal distribution of Y at y. The conditional distribution of Y given X = x is defined similarly: given the marginal distribution g(x) of X,

$$w(y \mid x) = \frac{f(x, y)}{g(x)}$$

## 1.6 Independence

Random variables X and Y are independent if and only if the joint distribution factors into the product of the marginals:

$$f(x, y) = g(x) \cdot h(y)$$

Equivalently, X and Y are independent if and only if

$$f(x \mid y) = g(x)$$

## 2 Expectation and Variance

### 2.1 Expectation

Definition: If X is a discrete random variable and f(x) is its probability distribution function, the expected value (or mean) of X is given by

$$E(X) = \mu_X = \sum_x x \cdot f(x)$$

For a continuous random variable, the expected value is given by

$$E(X) = \mu_X = \int_{-\infty}^{\infty} x \cdot f(x)\,dx$$

We can think of the expected value as a weighted average of X: each possible realization x is weighted by the probability that it occurs.

We can also take expectations of functions of a random variable:

$$E(w(X)) = \int_{-\infty}^{\infty} w(x) \cdot f(x)\,dx$$

Or expectations of functions of multiple random variables:

$$E(w(X, Y)) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} w(x, y) \cdot f(x, y)\,dx\,dy$$

#### 2.1.1 Properties of Expectations

1. For any constant c, E(c) = c.
2. For any constants a and b, E(aX + b) = aE(X) + b.
3. If $a_1, a_2, \dots, a_n$ are constants and $\{X_1, X_2, \dots, X_n\}$ are random variables, then

$$E\left(\sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i E(X_i)$$

### 2.2 Variance

While the expectation tells us about the average of X, the variance gives us information on the dispersion of X around its mean. The variance is defined as

$$\text{Var}(X) = \sigma_X^2 = E\left[(X - E(X))^2\right]$$

It turns out that this is equivalent to

$$\text{Var}(X) = E(X^2) - (E(X))^2$$

The variance is the average squared deviation from the mean. We often think about dispersion in terms of the standard deviation, which is defined as

$$\text{sd}(X) = \sigma_X = \sqrt{\text{Var}(X)}$$
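The discrete definitions above are easy to check numerically. The sketch below builds the marginals and a conditional distribution from a small joint pmf table; the table itself is an invented example, not from the lecture:

```python
# Marginal and conditional distributions for a small discrete joint pmf.
# The joint table f(x, y) below is an illustrative example, not from the lecture.
f = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

xs = sorted({x for x, _ in f})
ys = sorted({y for _, y in f})

# Marginal of X: g(x) = sum over y of f(x, y)
g = {x: sum(f[(x, y)] for y in ys) for x in xs}
# Marginal of Y: h(y) = sum over x of f(x, y)
h = {y: sum(f[(x, y)] for x in xs) for y in ys}

# Conditional of X given Y = y: f(x | y) = f(x, y) / h(y)
def cond_x_given_y(x, y):
    return f[(x, y)] / h[y]

print(g)                     # marginal of X: {0: ~0.3, 1: ~0.7}
print(h)                     # marginal of Y: {0: ~0.4, 1: ~0.6}
print(cond_x_given_y(0, 1))  # f(0,1)/h(1) = 0.2/0.6 ≈ 0.333
```

Note that each marginal sums to 1, and so does each conditional distribution f(· | y), which is a quick sanity check that the table is a valid joint pmf.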

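The expectation and variance formulas for the discrete case can likewise be verified with a few lines of Python, including the linearity property E(aX + b) = aE(X) + b and the shortcut formula Var(X) = E(X²) − (E(X))². The pmf is again an invented example:

```python
# Expectation and variance of a discrete random variable.
# The pmf below is an illustrative example, not from the lecture.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}   # f(x) for each possible value x

# E(X) = sum over x of x * f(x): the probability-weighted average of X.
mean = sum(x * p for x, p in pmf.items())

# Var(X) = E[(X - E(X))^2], and equivalently E(X^2) - (E(X))^2.
var_def      = sum((x - mean) ** 2 * p for x, p in pmf.items())
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2

sd = var_def ** 0.5              # standard deviation: sqrt of the variance

# Property 2: E(aX + b) = a*E(X) + b, here with a = 2, b = 1.
mean_lin = sum((2 * x + 1) * p for x, p in pmf.items())

print(mean)          # ≈ 2.1
print(var_def)       # ≈ 0.49
print(var_shortcut)  # ≈ 0.49 (same value via the shortcut formula)
print(mean_lin)      # ≈ 2 * 2.1 + 1 = 5.2
```

Computing the variance both ways, from the definition and from the shortcut, is a useful habit: the two numbers must agree, which catches arithmetic slips.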