Engineering Analysis ENG 3420 Fall 2009
Dan C. Marinescu
Office: HEC 439 B
Office hours: Tu-Th 11:00-12:00

Lecture 21
Last time: relaxation; non-linear systems.
Today: random variables and probability distributions; Matlab support for random variables; histograms; linear regression; linear least-squares regression; non-linear data models.
Next time: multiple linear regression; general linear least squares.

Statistics built-in functions
Built-in statistics functions for a column vector s:
- mean(s), median(s), mode(s): calculate the mean, median, and mode of s (mode is part of the Statistics Toolbox)
- min(s), max(s): calculate the minimum and maximum values in s
- var(s), std(s): calculate the variance and standard deviation of s
If a matrix is given, the statistics are returned for each column.

Histograms
- [n, x] = hist(s, x): determine the number of elements of s in each bin; x is a vector containing the center values of the bins
- [n, x] = hist(s, m): determine the number of elements of s in each of m bins; x will contain the centers of the bins; the default is m = 10
- hist(s, x), hist(s, m), or hist(s): with no output arguments, hist produces a histogram plot

Histogram Example (figure)

Linear Least Squares Regression
Linear least-squares regression is a method to determine the coefficients of a linear model that best fits a given data set. "Best" in the least-squares sense means minimizing the sum of the squares of the estimate residuals. For a straight-line model this gives

  S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

This method yields a unique line for a given set of data.

Least Squares Fit of a Straight Line
Using the model y = a_0 + a_1 x, the slope and intercept producing the best fit can be found from

  a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - (\sum x_i)^2}
  a_0 = \bar{y} - a_1 \bar{x}

Example
Force F (N) measured as a function of wind velocity v (m/s):

  i      x_i = v (m/s)   y_i = F (N)   x_i^2     x_i y_i
  1            10             25         100         250
  2            20             70         400        1400
  3            30            380         900       11400
  4            40            550        1600       22000
  5            50            610        2500       30500
  6            60           1220        3600       73200
  7            70            830        4900       58100
  8            80           1450        6400      116000
  Sum         360           5135       20400      312850

  a_1 = \frac{8(312850) - (360)(5135)}{8(20400) - (360)^2} = 19.47024
  a_0 = \bar{y} - a_1 \bar{x} = 641.875 - 19.47024(45) = -234.2857
  F_est = -234.2857 + 19.47024 v
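As a quick check of the hand calculation, the sketch below redoes the straight-line fit in MATLAB using the formulas above. It uses the eight (v, F) pairs from the table; the variable names v, F, and Fest are illustrative, not part of the lecture.

    % Least-squares straight-line fit of force vs. velocity (sketch)
    v = [10 20 30 40 50 60 70 80]';          % velocity, m/s
    F = [25 70 380 550 610 1220 830 1450]';  % force, N

    n  = length(v);
    a1 = (n*sum(v.*F) - sum(v)*sum(F)) / (n*sum(v.^2) - sum(v)^2);
    a0 = mean(F) - a1*mean(v);               % a1 ~ 19.47024, a0 ~ -234.2857
    Fest = a0 + a1*v;                        % predicted force at the data points

The built-in polyfit function discussed later in this lecture gives the same result: p = polyfit(v, F, 1), with p(1) the slope and p(2) the intercept.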
Nonlinear Models
Linear regression is predicated on the relationship between the dependent and independent variables being linear, which is not always the case. Three common nonlinear models are

  exponential:              y = \alpha_1 e^{\beta_1 x}
  power:                    y = \alpha_2 x^{\beta_2}
  saturation-growth-rate:   y = \frac{\alpha_3 x}{\beta_3 + x}

Linearization of Nonlinear Models

  Model                     Nonlinear                           Linearized
  exponential               y = \alpha_1 e^{\beta_1 x}          \ln y = \ln \alpha_1 + \beta_1 x
  power                     y = \alpha_2 x^{\beta_2}            \log y = \log \alpha_2 + \beta_2 \log x
  saturation-growth-rate    y = \frac{\alpha_3 x}{\beta_3 + x}  \frac{1}{y} = \frac{1}{\alpha_3} + \frac{\beta_3}{\alpha_3} \frac{1}{x}

Transformation Examples (figures)

Linear Regression Program (code listing)

Polynomial Least-Squares Fit
MATLAB has a built-in function polyfit that fits a least-squares n-th order polynomial to data:

  p = polyfit(x, y, n)
  - x: independent data
  - y: dependent data
  - n: order of the polynomial to fit
  - p: coefficients of the polynomial f(x) = p_1 x^n + p_2 x^{n-1} + ... + p_n x + p_{n+1}

MATLAB's polyval command can be used to evaluate the polynomial at given points using these coefficients:

  y = polyval(p, x)

Polynomial Regression
The least-squares procedure can be extended to fit data to a higher-order polynomial. The idea is again to minimize the sum of the squares of the estimate residuals. The figure shows the same data fit with (a) a first-order polynomial and (b) a second-order polynomial.

Process and Measures of Fit
For a second-order polynomial, the best fit means minimizing

  S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2

In general, for an m-th order polynomial this means minimizing

  S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m)^2

The standard error for fitting an m-th order polynomial to n data points is

  s_{y/x} = \sqrt{\frac{S_r}{n - (m + 1)}}

because the m-th order polynomial has m + 1 coefficients. The coefficient of determination r^2 is still found using

  r^2 = \frac{S_t - S_r}{S_t}
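The slides define S_r, S_t, r^2, and s_{y/x} but do not show them computed in code, so here is a minimal MATLAB sketch tying polyfit, polyval, and these fit measures together for a quadratic fit. It reuses the force-velocity data from the earlier example purely as illustration; the variable names are assumptions, not from the lecture.

    % Quadratic least-squares fit plus measures of fit (sketch)
    v = [10 20 30 40 50 60 70 80]';
    F = [25 70 380 550 610 1220 830 1450]';

    m  = 2;                          % order of the polynomial
    p  = polyfit(v, F, m);           % p(1)*v.^2 + p(2)*v + p(3)
    Fp = polyval(p, v);              % model predictions at the data points

    Sr  = sum((F - Fp).^2);          % sum of squared estimate residuals
    St  = sum((F - mean(F)).^2);     % total sum of squares about the mean
    r2  = (St - Sr)/St;              % coefficient of determination
    n   = length(v);
    syx = sqrt(Sr/(n - (m + 1)));    % standard error of the estimate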
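The linearization table above can also be turned into a short fitting recipe. The sketch below fits the exponential model y = \alpha_1 e^{\beta_1 x} by regressing ln y on x and back-transforming the intercept; it is not code from the lecture, and the data are synthetic, generated inside the script only to illustrate the steps.

    % Fitting an exponential model by linearization (sketch, synthetic data)
    x = (0:0.5:4)';
    y = 2.5*exp(0.8*x) .* (1 + 0.05*randn(size(x)));  % known model + small noise

    p = polyfit(x, log(y), 1);     % straight-line fit: ln(y) = ln(alpha1) + beta1*x
    beta1  = p(1);                 % slope of the linearized model
    alpha1 = exp(p(2));            % back-transform the intercept
    yfit   = alpha1*exp(beta1*x);  % fitted model in the original coordinates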