Engineering Analysis ENG 3420, Fall 2009
Lecture 21
Dan C. Marinescu
Office: HEC 439 B
Office hours: Tu-Th 11:00-12:00

Lecture 21
Last time:
- Relaxation
- Non-linear systems
- Random variables, probability distributions, MATLAB support for random variables
Today:
- Histograms
- Linear regression
- Linear least-squares regression
- Non-linear data models
Next time:
- Multiple linear regression
- General linear squares

Statistics built-in functions
Built-in statistics functions for a column vector s:
- mean(s), median(s), mode(s): calculate the mean, median, and mode of s. mode is part of the Statistics Toolbox.
- min(s), max(s): calculate the minimum and maximum values in s.
- var(s), std(s): calculate the variance and standard deviation of s.
If a matrix is given, the statistics are returned for each column.

Histograms
- [n, x] = hist(s, x): determines the number of elements in each bin of the data in s; x is a vector containing the center values of the bins.
- [n, x] = hist(s, m): determines the number of elements in each bin using m bins; x will contain the bin centers. The default is m = 10.
- hist(s, x), hist(s, m), or hist(s): with no output arguments, hist plots the histogram.

Histogram Example
(figure: sample histogram)

Linear Least-Squares Regression
Linear least-squares regression is a method for determining the "best" coefficients in a linear model for a given data set. "Best" in least-squares regression means minimizing the sum of the squares of the estimate residuals.
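The statistics built-ins and hist behavior above can be sketched in plain Python. This is a minimal sketch, not MATLAB's exact algorithm: `hist_counts` is a hypothetical helper mimicking `[n, x] = hist(s, m)`, and its bin-edge handling is an assumption.

```python
import statistics

# Pure-Python counterparts of the MATLAB statistics built-ins
# (statistics.variance/stdev use the n-1 divisor, matching MATLAB's var/std defaults)
s = [2.1, 2.4, 2.4, 3.0, 3.1, 3.5, 3.9, 4.2]
m_mean, m_median = statistics.mean(s), statistics.median(s)
m_var, m_std = statistics.variance(s), statistics.stdev(s)

def hist_counts(s, m=10):
    """Mimic [n, x] = hist(s, m): counts and centers of m equal-width bins
    spanning [min(s), max(s)]. The edge handling here is an assumption,
    not MATLAB's exact algorithm."""
    lo, hi = min(s), max(s)
    width = (hi - lo) / m
    centers = [lo + width * (i + 0.5) for i in range(m)]
    counts = [0] * m
    for v in s:
        i = min(int((v - lo) / width), m - 1)  # clamp the max value into the last bin
        counts[i] += 1
    return counts, centers

counts, centers = hist_counts(s, 4)
```

Every element of s lands in exactly one bin, so the counts always sum to len(s).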
For a straight-line model, this gives:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

This method yields a unique line for a given set of data.

Least-Squares Fit of a Straight Line
Using the model y = a_0 + a_1 x, the slope and intercept producing the best fit can be found using:

a_1 = (n \sum x_i y_i - \sum x_i \sum y_i) / (n \sum x_i^2 - (\sum x_i)^2)
a_0 = \bar{y} - a_1 \bar{x}

Example
With v (m/s) as the independent variable x and F (N) as the dependent variable y:

i | x_i | y_i  | x_i^2 | x_i y_i
1 |  10 |   25 |   100 |     250
2 |  20 |   70 |   400 |    1400
3 |  30 |  380 |   900 |   11400
4 |  40 |  550 |  1600 |   22000
5 |  50 |  610 |  2500 |   30500
6 |  60 | 1220 |  3600 |   73200
7 |  70 |  830 |  4900 |   58100
8 |  80 | 1450 |  6400 |  116000
Σ | 360 | 5135 | 20400 |  312850

a_1 = (8 · 312850 - 360 · 5135) / (8 · 20400 - 360^2) = 19.47024
a_0 = 641.875 - 19.47024 · 45 = -234.2857
F_est = -234.2857 + 19.47024 v

Nonlinear models
Linear regression is predicated on the relationship between the dependent and independent variables being linear; this is not always the case. Three common examples are:
- exponential: y = α_1 e^{β_1 x}
- power: y = α_2 x^{β_2}
- saturation-growth-rate: y = α_3 x / (β_3 + x)

Linearization of nonlinear models
- exponential: y = α_1 e^{β_1 x}  →  ln y = ln α_1 + β_1 x
- power: y = α_2 x^{β_2}  →  log y = log α_2 + β_2 log x
- saturation-growth-rate: y = α_3 x / (β_3 + x)  →  1/y = 1/α_3 + (β_3/α_3)(1/x)

Transformation Examples
(figure: data before and after linearizing transformations)

Linear Regression Program
(figure: MATLAB linear-regression script)

Polynomial least-squares fit
MATLAB has a built-in function polyfit that fits a least-squares n-th order polynomial to data:
p = polyfit(x, y, n)
- x: independent data
- y: dependent data
- n: order of polynomial to fit
- p: coefficients of the polynomial f(x) = p_1 x^n + p_2 x^{n-1} + … + p_n x + p_{n+1}
MATLAB's polyval command can be used to evaluate the polynomial from these coefficients:
y = polyval(p, x)

Polynomial Regression
The least-squares procedure can be extended to fit data to a higher-order polynomial.
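The slope and intercept formulas above, and the worked example, can be checked with a short pure-Python sketch (`linfit` is a hypothetical helper name, not a MATLAB built-in; the data are the v/F values from the example table):

```python
def linfit(x, y):
    """Least-squares straight line y = a0 + a1*x, using the slide's formulas."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = sy / n - a1 * sx / n  # a0 = ybar - a1 * xbar
    return a0, a1

# Data from the example: v in m/s, F in N
v = [10, 20, 30, 40, 50, 60, 70, 80]
F = [25, 70, 380, 550, 610, 1220, 830, 1450]
a0, a1 = linfit(v, F)  # a1 ≈ 19.47024, a0 ≈ -234.2857
```

The returned coefficients reproduce the hand calculation, giving F_est = -234.2857 + 19.47024 v.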
The idea is to minimize the sum of the squares of the estimate residuals. The figure shows the same data fit with (a) a first-order polynomial and (b) a second-order polynomial.

Process and Measures of Fit
For a second-order polynomial, the best fit minimizes:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2

In general, for an m-th order polynomial, this means minimizing:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2 - \cdots - a_m x_i^m)^2

The standard error for fitting an m-th order polynomial to n data points is

s_{y/x} = \sqrt{ S_r / (n - (m + 1)) }

because the m-th order polynomial has (m+1) coefficients. The coefficient of determination r^2 is still found using:

r^2 = (S_t - S_r) / S_t
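The polynomial fit and the fit measures above can be sketched in pure Python via the normal equations. `polyfit_ls` and `fit_stats` are hypothetical helper names; note that, unlike MATLAB's polyfit, the coefficients here are returned lowest order first (a_0 … a_m).

```python
import math

def polyfit_ls(x, y, m):
    """Fit an m-th order polynomial a0 + a1*x + ... + am*x^m by least squares,
    solving the (m+1)x(m+1) normal equations with Gaussian elimination."""
    A = [[sum(xi ** (j + k) for xi in x) for k in range(m + 1)] for j in range(m + 1)]
    b = [sum(yi * xi ** j for xi, yi in zip(x, y)) for j in range(m + 1)]
    for col in range(m + 1):  # forward elimination with partial pivoting
        piv = max(range(col, m + 1), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m + 1):
            f = A[r][col] / A[col][col]
            for c in range(col, m + 1):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    a = [0.0] * (m + 1)
    for r in range(m, -1, -1):  # back substitution
        a[r] = (b[r] - sum(A[r][c] * a[c] for c in range(r + 1, m + 1))) / A[r][r]
    return a

def fit_stats(x, y, a):
    """Standard error s_{y/x} = sqrt(Sr / (n - (m+1))) and r^2 = (St - Sr)/St."""
    n, m = len(x), len(a) - 1
    pred = [sum(ak * xi ** k for k, ak in enumerate(a)) for xi in x]
    Sr = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    ybar = sum(y) / n
    St = sum((yi - ybar) ** 2 for yi in y)
    return math.sqrt(Sr / (n - (m + 1))), (St - Sr) / St
```

For data generated exactly by y = 1 + 2x + 3x^2, a second-order fit recovers the coefficients and gives r^2 = 1. (Solving the normal equations directly is fine for low orders; MATLAB's polyfit uses a more numerically robust QR-based approach.)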