UCF EGN 3420 - Engineering Analysis Lecture Notes


Engineering Analysis ENG 3420, Fall 2009
Lecture 22

Dan C. Marinescu
Office: HEC 439 B
Office hours: Tu-Th 11:00-12:00

Lecture 22 topics:
- Quantification of errors
- Linear regression versus the sample mean
- Coefficient of determination
- Example
- Polynomial least-squares fit
- Fitting an mth-order polynomial to n data points
- Multiple linear regression
- General linear least squares
- Solving for the general linear least-squares coefficients
- Example
- Nonlinear models
- Example: comparison between the transformed power equation and the direct method
- Polynomial interpolation
- Matrix formulation of polynomial interpolation: find the coefficients p_1, p_2, ..., p_n knowing the values of the function f(x_1), f(x_2), ..., f(x_n)
- Ill-conditioned linear problems
- Problems
- Newton interpolating polynomials
- First-order Newton interpolating polynomial
- Second-order Newton interpolating polynomial
- Newton interpolating polynomial of degree n-1
- Divided differences

Attention: the last homework (HW5) and the last project are due on Tuesday, November 24!

Last time:
- Linear regression
- Exponential, power, and saturation nonlinear models
- Linear least-squares regression

Today:
- Linear regression versus the sample mean
Coefficient of determination Polynomial least squares fit Multiple linear regression General linear squares More on non-linear models Interpolation (Chapter 15) Polynomial interpolation Newton interpolating polynomials Lagrange interpolating polynomials Next Time SplinesQuantification of Errors For a straight line the sum of the squares of the estimate residuals is: The standard error of the estimate:sy/ x=Srn − 2Sr= ei2i=1n∑= yi− a0− a1xi()2i=1n∑Linear regression versus the sample mean What is the difference between linear regression and the case when we simply compute the sample mean and draw a line corresponding to the sample mean? The spread Æ the histogram of the differences between the values predicted bylinear regression and the actual sample values. Regression data showing (a) the spread of data around the mean of the dependent data and (b) the spread of the data around the best fit line: The reduction in spread represents the improvement due to linear regression.Coefficient of Determination The coefficient of determination r2Æ r2represents the percentage of the original uncertainty explained by the model. For a perfect fit, Sr=0 and r2=1. If r2=0, there is no improvement over simply picking the mean. If r2<0, the model is worse than simply picking the mean!r2=St−SrSt()∑=−=niityyS12ExampleV(m/s)F(N)(yi-a0-a1xi)2417172459113016699818378918016044216118ixiyia0+a1xi(yi-ȳ)21 10 25 -39.58 3805352 20 70 155.12 3270413 30 380 349.82 685794 40 550 544.52 84415 50 610 739.23 10166 60 1220 933.93 3342297 70 830 1128.63 353918 80 1450 1323.33 653066Σ360 5135 1808297Fest=−234.2857+19.47024vSt= yi− y ()2∑=1808297Sr= yi− a0− a1xi()2∑= 216118sy=18082978 −1= 508.26sy/ x=2161188 − 2=189.79r2=1808297 − 2161181808297= 0.880588.05% of the original uncertaintyhas been explained by the linear modelPolynomial least-fit squares MATLAB has a built-in function polyfit that fits a least-squares n-th order polynomial to data: p = polyfit(x, y, n) x: independent 
data y: dependent data n: order of polynomial to fit p: coefficients of polynomialf(x)=p1xn+p2xn-1+…+pnx+pn+1 MATLAB’s polyval command can be used to compute a value using the coefficients. y = polyval(p, x)Fitting an mthorder polynomial to n data points Minimize: The standard error is:because the mthorder polynomial has (m+1) coefficients. The coefficient of determination r2is:()∑=−=niityyS12 Sr= ei2i=1n∑= yi− a0− a1xi− a2xi2−L− amxim()2i=1n∑sy/ x=Srn − m +1()r2=St− SrStMultiple Linear Regression Now y is a linear function of two or more independent variables. The best fit Æminimize the sum of the squares of the estimate residuals: For example when:instead of a line we have a plane Sr= ei2i=1n∑= yi−a0−a1x1,i−a2x2,i−Lamxm,i()2i=1n∑ y = a0+ a1x1+ a2x2+Lamxm22110xaxaay++=General Linear Least Squares Linear, polynomial, and multiple linear regression all belong to the general linear least-squares model:where z0, z1, …, zmare a set of m+1 basis functions and e is the error of the fit. The basis functions can be any function data but cannot contain any of the coefficients a0, a1, etc. The equation can be re-written for each data point as a matrix equation:where {y} is a vector of n dependent data, {a} is a vector of (m+1) coefficients of the equation, {e} contains the error at each point, and [Z] is: with zjirepresenting the value of the jthbasis function calculated at the ithpoint. Z[]=z01z11L zm1z02z12L zm2MMOMz0nz1nL zmn⎡ ⎣ ⎢ ⎢ ⎢ ⎤ ⎦ ⎥ ⎥ ⎥ y = a0z0+a1z1+a2z2+Lamzm+ ey{}=Z[]a{}+e{}Solving General Linear Least Squares Coefficients Generally, [Z] is an n x (m+1) matrix. Simple inversion cannot be used to solve for the (m+1) {a}. 
Instead, the sum of the squares of the estimate residuals is minimized:

    S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2

The outcome of this minimization is the normal equations:

    [Z]^T [Z] {a} = [Z]^T {y}

Example

Given the column vectors x and y, find the coefficients of the best-fit parabola y = a_0 + a_1 x + a_2 x^2:

    Z = [ones(size(x)) x x.^2]
    a = (Z'*Z)\(Z'*y)

MATLAB's left-divide automatically includes the [Z]^T terms if the matrix is not square, so

    a = Z\y

would work as well. To calculate measures of fit:

    St  = sum((y - mean(y)).^2)
    Sr  = sum((y - Z*a).^2)
    r2  = 1 - Sr/St
    syx = sqrt(Sr/(length(x) - length(a)))

Nonlinear Models

How do we deal with nonlinear models, i.e., when we cannot fit a straight line to the sample data?

- Transform the variables and solve for the best fit of the transformed variables. This works well for exponential, power, and saturation models, but not all equations can be transformed easily or at all.
- Perform nonlinear regression to determine the least-squares fit directly. To do so, write a function that returns the sum of the squares of the estimate residuals S_r for a given fit, then use the fminsearch function to find the values of the coefficients at which the minimum occurs. The arguments of the function that computes S_r should be the coefficients, the independent variables, and the dependent variables.

Example

Given two vectors of n observations, ym for the force F and xm for the [...]
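The transform-the-variables approach can be sketched concretely for a power model. This example is not from the original notes (the preview cuts off mid-example); it reuses the drag data from the earlier example and fits F = alpha * v^beta by ordinary linear regression on (ln v, ln F):

```python
import math

# Transform-and-fit sketch for a power model F = alpha * v**beta.
# Taking logs gives ln F = ln(alpha) + beta * ln v, a straight line in
# (ln v, ln F), so ordinary linear least squares applies.
v = [10, 20, 30, 40, 50, 60, 70, 80]
F = [25, 70, 380, 550, 610, 1220, 830, 1450]

lx = [math.log(vi) for vi in v]
ly = [math.log(Fi) for Fi in F]
n = len(lx)
mx = sum(lx) / n
my = sum(ly) / n

# The slope of the log-log fit is the exponent beta;
# the intercept recovers alpha after exponentiating.
beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(lx, ly))
        / sum((xi - mx) ** 2 for xi in lx))
alpha = math.exp(my - beta * mx)

print(f"F = {alpha:.3f} * v^{beta:.3f}")
```

Note that this minimizes the squared residuals of the logarithms, not of F itself, so a direct nonlinear fit (the fminsearch route described above) generally yields somewhat different coefficients; that difference is exactly what the lecture's comparison between the transformed power equation and the direct method is about.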

