PROBABILITY AND STATISTICS IN COMPUTER SCIENCE AND SOFTWARE ENGINEERING
Chapter 4: Continuous Distributions

CONTINUOUS RANDOM VARIABLES
• We've been talking about continuous random variables – random variables that can take on a continuous set of values, not a discrete set of values.
• A continuous random variable has a probability density function (pdf), f(x), and a cumulative distribution function (cdf), F(x).
• We saw formulas for expectation (generally, the center of gravity of the density) and variance (how "spread out" the distribution is), along with covariance and correlation.
• These were sums for discrete random variables; they are integrals for continuous random variables.
• We saw some examples of densities (distributions): Uniform, Exponential.

DISTRIBUTIONS: GAMMA
• This distribution is helpful when modeling processes that have several steps, each of which takes an Exponentially distributed amount of time.
• The distribution has two parameters: α (the number of independent steps) and λ (the frequency parameter for the Exponential time of each step).
• Example: Visitors arrive at a website with a frequency of λ = 3 per minute. Every 200th visitor sees a special promotion. The waiting time for the promotion to appear follows a Gamma distribution with α = 200 and λ = 3.
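The multi-stage picture can be checked by simulation. This is a minimal sketch (not from the slides): the waiting time for the 200th visitor is the sum of 200 independent Exponential inter-arrival times, so its average should match the Gamma expectation α/λ ≈ 66.67 minutes.

```python
import random

# Simulate the website example: the wait for the 200th visitor is the
# sum of alpha = 200 independent Exponential(lam = 3) step times, which
# is exactly a Gamma(alpha, lam) random variable.
random.seed(42)
alpha, lam = 200, 3.0          # number of steps, arrivals per minute

def gamma_sample():
    """One waiting time: sum of alpha Exponential(lam) step times."""
    return sum(random.expovariate(lam) for _ in range(alpha))

samples = [gamma_sample() for _ in range(2000)]
mean = sum(samples) / len(samples)
print(f"simulated mean wait: {mean:.1f} minutes (theory: {alpha / lam:.2f})")
```

With 2000 samples the simulated mean lands very close to the theoretical 66.67 minutes, illustrating why the Gamma distribution is the natural model for multi-stage Exponential processes.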
DISTRIBUTIONS: GAMMA
• The density function for this distribution is given by
  f(x) = λ^α x^(α−1) e^(−λx) / Γ(α),  for x > 0.
• The denominator Γ(α) is the "Gamma function" we discussed earlier.
• This distribution can be used for other models, not just multi-stage Exponential times; α is usually referred to as the shape parameter for this reason.
• When α = 1, we have the usual Exponential distribution with frequency λ.
• When α = n/2 and λ = 1/2, we have what is called the Chi-squared distribution with n degrees of freedom (more on this later).

DISTRIBUTIONS: GAMMA
• Formulas for expectation and variance for the Gamma distribution are shown on pages 85–86.
• In general, for the Gamma distribution,
  E(X) = α/λ  and  Var(X) = α/λ².
• So for the website example we just saw with α = 200 and λ = 3, we would have an expectation of 66.67 (minutes) with a variance of 22.22 (minutes squared). The standard deviation is 4.71 minutes.
• For an example application of this distribution, see Example 4.7 on page 87.

DISTRIBUTIONS: GAMMA-POISSON
• We can think of the Gamma distribution with integer α and positive λ as a probability distribution for the time of the α-th rare event.
• As in Example 4.7, we can define a continuous variable T that describes the time of the α-th rare event, and use the Gamma distribution to calculate probabilities for the values this variable may take on.
• We note that computing P(T > t) is the same thing as computing the probability that fewer than α rare events have occurred by time t, which is given by the Poisson distribution. So we can relate the Gamma and Poisson distributions as follows: define X to be the discrete variable that represents the number of rare events occurring before time t.
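The moment formulas above can be applied directly to the website example. A short sketch, using only the slide's formulas E(X) = α/λ and Var(X) = α/λ²:

```python
import math

# Gamma moments from the slides, applied to the website example
# (alpha = 200 steps, lambda = 3 visitors per minute).
alpha, lam = 200, 3.0
expectation = alpha / lam          # minutes
variance = alpha / lam ** 2        # minutes squared
std_dev = math.sqrt(variance)      # minutes

print(f"E(T)   = {expectation:.2f} min")    # 66.67
print(f"Var(T) = {variance:.2f} min^2")     # 22.22
print(f"sd(T)  = {std_dev:.2f} min")        # 4.71
```

This reproduces the numbers quoted on the slide: 66.67 minutes expected wait, variance 22.22, standard deviation 4.71 minutes.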
DISTRIBUTIONS: GAMMA-POISSON
• We then have
  P(T > t) = P(X < α),
  where the left side is a Gamma probability and the right side is Poisson with parameter λt.
• Note we have scaled the frequency parameter for the Poisson: the expected number of events by time t is λt.
• We can also take the complement: P(T ≤ t) = P(X ≥ α).
• This makes the computation of Gamma-distributed probabilities much easier – see Examples 4.8 and 4.9 on page 88.

DISTRIBUTIONS: NORMAL
• The Normal distribution is extremely important and widely used – it models averages, errors, and sums, and we will also see it in sampling.
• The density function is given by
  f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)),
  where μ is the expectation and σ is the standard deviation. Here the continuous variable x may take on any value.
• It is sometimes useful to standardize the variable; when μ = 0 and σ = 1 we have the Standard Normal distribution. The standardized variable is usually denoted by Z.
• Translating to and from the Standard Normal can be done using the formulas on page 90: Z = (X − μ)/σ and X = μ + σZ.
• The three examples on page 91 give a good overview of working with this distribution.

CENTRAL LIMIT THEOREM
• The Central Limit Theorem is extremely useful – we will use it extensively when we get to statistics.
• Basically, if we have a set of independent random variables X₁, X₂, …, Xₙ which all have the same expectation μ and standard deviation σ, then the sum
  Sₙ = X₁ + X₂ + … + Xₙ
  will be approximately Normally distributed for large n.
• The sums can also be normalized, as is shown on page 93:
  Z = (Sₙ − nμ) / (σ√n)
  is approximately Standard Normal for large n.
• The two examples on page 93 show how this theorem can be used.

CENTRAL LIMIT THEOREM
• We saw that when the Xᵢ are Bernoulli variables with probability p, the sum is a Binomial variable.
• For small values of p we could approximate the Binomial distribution with a Poisson distribution.
• For moderate values of p and large n, we can approximate the Binomial distribution with a Normal distribution with μ = np and σ = √(np(1 − p)).
• Notice that the Binomial is a discrete distribution while the Normal is a continuous distribution.
• This implies the need for a continuity correction – see pages 94–95.
• Example 4.15 shows how this can be done.
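The Normal approximation with continuity correction can be sketched in a few lines. This is an illustration in the spirit of pages 94–95, not the textbook's example; the values n = 100, p = 0.4, k = 45 are assumptions chosen for demonstration, and the Standard Normal cdf Φ is built from the error function.

```python
import math

def phi(z):
    """Standard Normal cdf, via the identity Phi(z) = (1 + erf(z/sqrt(2)))/2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative (not from the slides): P(X <= 45) for X ~ Binomial(100, 0.4).
n, p, k = 100, 0.4, 45
mu = n * p                             # 40
sigma = math.sqrt(n * p * (1 - p))     # sqrt(24) ~ 4.899

# Continuity correction: treat the discrete event {X <= 45} as {X <= 45.5}.
approx = phi((k + 0.5 - mu) / sigma)

# Exact Binomial probability, for comparison.
exact = sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

print(f"exact  P(X <= {k}) = {exact:.4f}")
print(f"Normal approx with correction = {approx:.4f}")
```

The two values agree to about two decimal places, which is the point of the continuity correction: shifting the cutoff by 0.5 accounts for the discrete Binomial mass at each integer when it is replaced by a continuous Normal density.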