Introduction to Asymptotic Theory (ARE 210)
Technical Notes on Characteristic Functions

The characteristic function of a random variable $X$ is defined as
$$\varphi_X(t) = E\!\left(e^{\iota t X}\right), \qquad -\infty < t < \infty, \qquad \iota = \sqrt{-1}.$$
The function $\varphi_X(t)$ is finite for all random variables $X$ and all real numbers $t$. The distribution function of $X$, and the density function when it exists, can be obtained from the characteristic function by means of an inversion formula.

Facts involving complex variables

1. We can write any complex number $z$ as $z = x + \iota y$, where $x$ and $y$ are real numbers.

2. The modulus (absolute value) $|z|$ of a complex number is $|z| = \sqrt{x^2 + y^2}$.

3. The distance between any two complex numbers $z_1$ and $z_2$ is $|z_1 - z_2| = \sqrt{(x_1 - x_2)^2 + (y_1 - y_2)^2}$.

4. When a function of a real variable has a power series expansion with a positive radius of convergence, we can use that power series to define a corresponding function of a complex variable. Thus we define
$$e^z = \sum_{n=0}^{\infty} \frac{z^n}{n!}$$
for any complex number $z$. A direct calculation verifies that the relation $e^{z_1 + z_2} = e^{z_1} e^{z_2}$ is valid for all complex numbers $z_1$ and $z_2$. Letting $z = \iota t$, where $t$ is a real number,
$$e^{\iota t} = \sum_{n=0}^{\infty} \frac{(\iota t)^n}{n!} = 1 + \iota t - \frac{t^2}{2!} - \iota\frac{t^3}{3!} + \frac{t^4}{4!} + \iota\frac{t^5}{5!} - \cdots = \left(1 - \frac{t^2}{2!} + \frac{t^4}{4!} - \cdots\right) + \iota\left(t - \frac{t^3}{3!} + \frac{t^5}{5!} - \cdots\right).$$
Recall the Maclaurin series expansions of the cosine and sine functions,
$$\cos(t) = 1 - \frac{t^2}{2!} + \frac{t^4}{4!} - \cdots \qquad \text{and} \qquad \sin(t) = t - \frac{t^3}{3!} + \frac{t^5}{5!} - \cdots.$$
From these we obtain Euler's formula,
$$e^{\iota t} = \cos(t) + \iota \sin(t).$$
This is very helpful because it allows us to make use of trigonometric identities together with the algebra of exponential functions. We have $\cos(-t) = \cos(t)$ and $\sin(-t) = -\sin(t)$, so that by simply changing the sign of $t$ in the expansion above we get
$$e^{-\iota t} = \cos(t) - \iota \sin(t).$$
As a result, if we add $e^{\iota t}$ and $e^{-\iota t}$, we obtain the identity
$$\cos(t) = \tfrac{1}{2}\left(e^{\iota t} + e^{-\iota t}\right),$$
while if we subtract $e^{-\iota t}$ from $e^{\iota t}$, we obtain the identity
$$\sin(t) = \frac{e^{\iota t} - e^{-\iota t}}{2\iota} = -\tfrac{1}{2}\iota\left(e^{\iota t} - e^{-\iota t}\right).$$
The second equality uses $1/\iota = \iota/\iota^2 = \iota/(-1) = -\iota$. Finally, we obtain the modulus of $e^{\iota t}$ as
$$\left|e^{\iota t}\right| = \sqrt{\cos^2(t) + \sin^2(t)} = 1 \quad \forall\, t \in \mathbb{R},$$
so that, as a function of $t$, $e^{\iota t}$ is smooth and absolutely bounded, its real and imaginary parts lying in $[-1, +1]$.

5. If $f(t)$ and $g(t)$ are real-valued functions of $t$, then $h(t) = f(t) + \iota g(t)$ defines a complex-valued function of $t$.

6. We can differentiate $h(t)$ by differentiating $f(t)$ and $g(t)$ separately, $h'(t) = f'(t) + \iota g'(t)$, so long as $f'(t)$ and $g'(t)$ exist.

7. Similarly, we define $\int_a^b h(t)\,dt = \int_a^b f(t)\,dt + \iota \int_a^b g(t)\,dt$, so long as the integrals for $f$ and $g$ exist.

8. The formula $\dfrac{d}{dt}e^{ct} = c\,e^{ct}$ is valid for any complex constant $c$.

9. The fundamental theorem of calculus holds, and in particular, if $c$ is a nonzero complex constant, then $\int_a^b e^{ct}\,dt = \dfrac{e^{cb} - e^{ca}}{c}$.

10. A complex-valued random variable $Z$ can be written in the form $Z = X + \iota Y$, where $X$ and $Y$ are real-valued random variables. Its expectation $E(Z)$ is defined as $E(Z) = E(X + \iota Y) = E(X) + \iota E(Y)$.

11. The random variable $Z$ has a finite expectation if and only if $E|Z| < \infty$, and in that case $|E(Z)| \le E|Z|$.

12. The formula $E(a_1 Z_1 + a_2 Z_2) = a_1 E(Z_1) + a_2 E(Z_2)$ is valid whenever $a_1$ and $a_2$ are complex constants and $Z_1$ and $Z_2$ are complex-valued random variables, each possessing a finite expectation.

13. Suppose that $X$ is a random variable and $t$ is a real constant. Then $\left|e^{\iota t X}\right| = 1$, so that $e^{\iota t X}$ has a finite expectation and the characteristic function $\varphi_X(t)$, given by $\varphi_X(t) = E\!\left(e^{\iota t X}\right)$ for all $t \in \mathbb{R}$, is well defined (exists) and satisfies
$$\varphi_X(0) = E(e^{0}) = E(1) = 1, \qquad \left|\varphi_X(t)\right| = \left|E\!\left(e^{\iota t X}\right)\right| \le E\left|e^{\iota t X}\right| = E(1) = 1 \quad \forall\, t \in \mathbb{R}.$$

14. The reason characteristic functions are finite for all $t$, while moment generating functions may not be, is that $e^{\iota t x}$ is bounded for all real $t$ and $x$, while $e^{t x}$ is unbounded in $x$ for every $t \neq 0$.

15. Suppose $X$ and $Y$ are independent random variables. Then $e^{\iota t X}$ and $e^{\iota t Y}$ are independent random variables and
$$\varphi_{X+Y}(t) = E\!\left(e^{\iota t (X+Y)}\right) = E\!\left(e^{\iota t X}\right) E\!\left(e^{\iota t Y}\right) = \varphi_X(t)\,\varphi_Y(t) \quad \forall\, t \in \mathbb{R},$$
so that the characteristic function of the sum of a finite number of independent random variables is the product of the individual characteristic functions.

16. $\varphi_X(t)$ is a continuous function of $t$, and if $X$ has a finite $n$th moment, then
$$\varphi_X^{(n)}(t) = \frac{d^n}{dt^n} E\!\left(e^{\iota t X}\right) = E\!\left(\frac{d^n}{dt^n} e^{\iota t X}\right) = \iota^n E\!\left(X^n e^{\iota t X}\right).$$
In particular, $\varphi_X^{(n)}(0) = \iota^n E(X^n)$. (Facts 13, 15, and 16 are illustrated numerically in the sketch that follows this list.)

17. Suppose that $M_X(t) = \sum_{n=0}^{\infty} \dfrac{E(X^n)}{n!}\, t^n$ is finite on $-t_0 < t < t_0$ for some positive number $t_0$. Then the power series expansion of $\varphi_X(t)$,
$$\varphi_X(t) = E\!\left(e^{\iota t X}\right) = E\!\left(\sum_{n=0}^{\infty} \frac{(\iota t X)^n}{n!}\right) = \sum_{n=0}^{\infty} \frac{(\iota t)^n E(X^n)}{n!},$$
also holds on $-t_0 < t < t_0$.

18. Let $X$ be a random variable whose moment generating function $M_X(t)$ is finite on $-t_0 < t < t_0$ for some $t_0$. If we replace $t$ by $\iota t$ in the formula for the moment generating function, we obtain the corresponding formula for the characteristic function, $M_X(\iota t) = E\!\left(e^{\iota t X}\right) = \varphi_X(t)$.
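To make facts 13, 15, and 16 concrete, here is a minimal numerical sketch in Python. It is illustrative only: NumPy is assumed to be available, and the Bernoulli parameter $p = 0.3$, the grid of $t$ values, and the finite-difference step are arbitrary choices rather than anything prescribed by the notes. The sketch computes $\varphi_X(t) = E(e^{\iota t X})$ directly for a Bernoulli($p$) variable, checks that $\varphi_X(0) = 1$ and $|\varphi_X(t)| \le 1$, verifies the product rule $\varphi_{X+Y}(t) = \varphi_X(t)\varphi_Y(t)$ for an independent copy $Y$ of $X$, and compares a finite-difference estimate of $\varphi_X'(0)$ with $\iota E(X) = \iota p$.

```python
# Illustrative sketch of facts 13, 15, and 16; the Bernoulli example and grid are arbitrary choices.
import numpy as np

p = 0.3                                  # P(X = 1); P(X = 0) = 1 - p
t = np.linspace(-10.0, 10.0, 2001)       # grid of real t values; t = 0 sits at the middle index

# phi_X(t) = E[exp(i t X)], computed directly over the two support points {0, 1}
phi_X = (1 - p) * np.exp(1j * t * 0) + p * np.exp(1j * t * 1)

# Fact 13: phi_X(0) = 1 and |phi_X(t)| <= 1 for every t
assert np.isclose(phi_X[t.size // 2], 1.0)
assert np.all(np.abs(phi_X) <= 1.0 + 1e-12)

# Fact 15: for independent X and Y, phi_{X+Y}(t) = phi_X(t) * phi_Y(t).
# With Y an independent copy of X, X + Y is Binomial(2, p) with support {0, 1, 2}.
pmf_sum = [(1 - p) ** 2, 2 * p * (1 - p), p ** 2]
phi_sum = sum(pk * np.exp(1j * t * k) for k, pk in enumerate(pmf_sum))
assert np.allclose(phi_sum, phi_X * phi_X)

# Fact 16 with n = 1: phi_X'(0) = i * E(X) = i * p; compare with a central finite difference
def phi(s):
    # closed form of the Bernoulli(p) characteristic function, used only for the derivative check
    return (1 - p) + p * np.exp(1j * s)

h = 1e-6
deriv_at_zero = (phi(h) - phi(-h)) / (2 * h)
assert np.isclose(deriv_at_zero, 1j * p, atol=1e-6)

print("numerical checks of facts 13, 15, and 16 passed")
```

The checks are deliberately redundant with the algebra above; their only purpose is to show that each stated property can be confirmed by direct computation on a simple distribution.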
Inversion Theorems:

1. Suppose that $X$ is an integer-valued random variable. Then its characteristic function is given by
$$\varphi_X(t) = \sum_{j=-\infty}^{\infty} e^{\iota t j} f_X(j).$$
One of the most powerful properties of $\varphi_X(t)$ is that it can be used to calculate $f_X(k)$. Specifically, we have the inversion formula
$$f_X(k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} e^{-\iota t k}\, \varphi_X(t)\, dt.$$

2. Let $X_1, X_2, \ldots, X_n$ be independent, identically distributed integer-valued random variables and let $S_n = X_1 + X_2 + \cdots + X_n$. Then
$$f_{S_n}(k) = \frac{1}{2\pi} \int_{-\pi}^{\pi} e^{-\iota t k}\, \left[\varphi_{X_1}(t)\right]^n dt.$$

3. Let $X$ be a continuous random variable whose characteristic function $\varphi_X(t)$ is integrable, i.e., $\int_{-\infty}^{\infty} |\varphi_X(t)|\, dt < \infty$; then the probability density function for $X$ is given by
$$f_X(x) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-\iota t x}\, \varphi_X(t)\, dt.$$

Uniqueness Theorem: If two random variables have the same characteristic function, then they also have the same distribution function.

Convergence Theorem: Let $X_n$, $n \ge 1$, and $X$ be random variables such that $\lim_{n \to \infty} \varphi_{X_n}(t) = \varphi_X(t)$ for every $t$. Then $\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$ at all points $x$ where $F_X$ is continuous; that is, convergence of characteristic functions implies convergence of the corresponding distribution functions.

Continuity Theorem: Distribution ...
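As a quick numerical check on Inversion Theorem 1 above, here is a minimal Python sketch. It is illustrative only: NumPy is assumed, $X$ is taken to be Poisson with rate $\lambda = 2.5$ (an arbitrary choice; its characteristic function $\exp(\lambda(e^{\iota t} - 1))$ is standard but is not derived in these notes), and a trapezoidal rule on a 4001-point grid stands in for the exact integral. For each $k$ it evaluates $\frac{1}{2\pi}\int_{-\pi}^{\pi} e^{-\iota t k}\varphi_X(t)\,dt$ and compares the result with the Poisson probability mass function.

```python
# Illustrative sketch of Inversion Theorem 1; the Poisson example and integration grid are arbitrary choices.
import numpy as np
from math import exp, factorial

lam = 2.5
t = np.linspace(-np.pi, np.pi, 4001)

# Characteristic function of a Poisson(lam) variable: phi_X(t) = exp(lam * (e^{i t} - 1))
phi_X = np.exp(lam * (np.exp(1j * t) - 1.0))

for k in range(8):
    # Inversion formula: f_X(k) = (1 / 2pi) * integral over [-pi, pi] of e^{-i t k} phi_X(t) dt
    integrand = np.exp(-1j * t * k) * phi_X
    f_k = np.trapz(integrand, t).real / (2.0 * np.pi)   # imaginary part vanishes up to roundoff
    exact = exp(-lam) * lam ** k / factorial(k)          # Poisson pmf for comparison
    print(f"k = {k}:  inversion = {f_k:.6f}   pmf = {exact:.6f}")
```

With a grid this fine, the recovered probabilities agree with the exact pmf to roughly six decimal places, which is the point of the exercise: the characteristic function contains the entire distribution of an integer-valued random variable.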

