
Chapter 17: Least-Squares Regression

Copyright The McGraw-Hill Companies, Inc. Permission required for reproduction or display.

Course Policies

Lecturer: Dr Intan Salwani Ahamad, ext. 4407, intan@eng.upm.edu.my

Textbook: Numerical Methods for Engineers, 5th Edition, Steven C. Chapra & Raymond P. Canale, McGraw-Hill International Edition, RM60 (Serdang old flat bookshop).

Timetable: Lectures R10-12 DK5, K12-1 DK5; Lab S10-1 UIT3 (G1), K2-5 UIT3 (G2)

Assessments: Tests (30%), Final (40%), Lab (20%), Others (10%)

Objectives:

To describe the principles of optimization and simulation in the chemical industry
To apply numerical methods in solving curve-fitting and root-finding problems
To apply numerical methods for solving ordinary and partial differential equations
To apply optimization methods in solving engineering problems

Program Outcomes:

PO1 (K): Apply knowledge of mathematics and engineering science.
PO3 (K): Analyze and interpret data.
PO4 (K): Design systems, components, or processes to meet design requirements.
PO8 (K): Identify, formulate, and provide creative, innovative, and effective solutions to problems.
PO14 (S): Use engineering skills, techniques, and tools for practice in industry.
PO15 (K): Solve problems in advanced design and development.

Curve Fitting

Describes techniques for fitting curves (curve fitting) to discrete data in order to obtain intermediate estimates. There are two general approaches to curve fitting:

Where the data exhibit a significant degree of scatter, the strategy is to derive a single curve that represents the general trend of the data (Fig. 5.1a).
Where the data are very precise, the strategy is to pass a curve or a series of curves through each of the points (Fig. 5.1b, c).

Noncomputer methods for curve fitting give quick estimates, but the results are subjective: e.g. a straight line (5.1a), linear interpolation (5.1b), and curves (5.1c) can all be drawn from the same data.

In engineering, two types of applications are encountered:

Trend analysis. Predicting values of the dependent variable; may include extrapolation beyond the data points or interpolation between them.
Hypothesis testing. Comparing an existing mathematical model with measured data.

Figure 5.1

Simple Statistics

In the course of an engineering study, if several measurements are made of a particular quantity, additional insight can be gained by summarizing the data in one or more well-chosen statistics that convey as much information as possible about specific characteristics of the data set. These descriptive statistics are most often selected to represent:

The location of the center of the distribution of the data
The degree of spread of the data

Arithmetic mean. The sum of the individual data points $y_i$ divided by the number of points $n$:

$$\bar{y} = \frac{\sum y_i}{n}, \qquad i = 1, \ldots, n$$
Standard deviation. The most common measure of spread for a sample:

$$s_y = \sqrt{\frac{S_t}{n - 1}}, \qquad S_t = \sum (y_i - \bar{y})^2$$

or, in an equivalent computational form,

$$s_y = \sqrt{\frac{\sum y_i^2 - \left(\sum y_i\right)^2 / n}{n - 1}}$$

Variance. The spread can also be represented by the square of the standard deviation:

$$s_y^2 = \frac{\sum (y_i - \bar{y})^2}{n - 1}$$

where $n - 1$ is the number of degrees of freedom.

Coefficient of variation. Quantifies the spread of the data relative to its central tendency:

$$\mathrm{c.v.} = \frac{s_y}{\bar{y}} \times 100\%$$
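
For concreteness, these statistics translate into a few lines of Python. A minimal sketch; the measurement values are invented for illustration:

import numpy as np

# Hypothetical sample: repeated measurements of a single quantity
y = np.array([6.395, 6.435, 6.485, 6.495, 6.505, 6.515, 6.555, 6.625])

n = y.size
ybar = y.sum() / n                # arithmetic mean
St = np.sum((y - ybar) ** 2)      # total sum of squares about the mean
sy = np.sqrt(St / (n - 1))        # standard deviation
var = St / (n - 1)                # variance (n - 1 degrees of freedom)
cv = sy / ybar * 100              # coefficient of variation, in percent

print(ybar, sy, var, cv)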

Least-Squares Regression

Used for data with substantial error, e.g. experimental data. The aim is to derive an approximating function that fits the general trend of the data without necessarily matching the individual points. One way to do this is to visually inspect the plotted data and sketch a "best" line through the points, but this is deficient: different analysts would draw different lines. A better approach is to establish an objective basis, e.g. derive the curve that minimizes the discrepancy between the data points and the curve: this is least-squares regression.

Figure 17.1

Linear Regression

Fitting a straight line to a set of paired observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$:

$$y = a_0 + a_1 x + e$$

where $a_1$ is the slope, $a_0$ is the intercept, and $e$ is the error, or residual, between the model and the observations.

Criteria for a Best Fit

Minimize the sum of the residual errors for all available data:

$$\sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)$$

where $n$ is the total number of points. However, this is an inadequate criterion (Fig. 17.2a): any straight line passing through the midpoint of the connecting line results in a minimum value equal to zero, because the errors cancel.

Minimize the sum of the absolute values of the residuals:

$$\sum_{i=1}^{n} |e_i| = \sum_{i=1}^{n} |y_i - a_0 - a_1 x_i|$$

This is also an inadequate criterion (Fig. 17.2b): any straight line falling within the dashed lines will minimize the sum of the absolute values.

The minimax criterion minimizes the maximum distance that an individual point falls from the line. This too is an inadequate criterion (Fig. 17.2c), because it gives undue influence to an outlier, a single point with a large error.

Figure 17.2

The best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_{i,\mathrm{measured}} - y_{i,\mathrm{model}} \right)^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$

This criterion yields a unique line for a given set of data. We need to find the values of $a_0$ and $a_1$ that minimize $S_r$.
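
To make the criterion concrete, here is a minimal Python sketch that evaluates $S_r$ for two candidate lines; the data values are invented for illustration and sum_sq_residuals is a hypothetical helper name:

import numpy as np

# Invented data points and two candidate lines y = a0 + a1*x
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.5, 2.5, 2.0, 4.0, 3.5])

def sum_sq_residuals(a0, a1, x, y):
    # S_r = sum over i of (y_i - a0 - a1*x_i)^2
    e = y - a0 - a1 * x
    return np.sum(e ** 2)

print(sum_sq_residuals(0.0, 1.0, x, y))  # 3.75 for a rough guess
print(sum_sq_residuals(0.2, 0.8, x, y))  # 1.95: smaller S_r, a better line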

Least-Squares Fit of a Straight Line


Minimizing $S_r$ requires its partial derivatives with respect to each coefficient to be zero:

$$\frac{\partial S_r}{\partial a_0} = -2 \sum (y_i - a_0 - a_1 x_i) = 0$$

$$\frac{\partial S_r}{\partial a_1} = -2 \sum \left[ (y_i - a_0 - a_1 x_i)\, x_i \right] = 0$$

Expanding, and noting that $\sum a_0 = n a_0$:

$$0 = \sum y_i - n a_0 - a_1 \sum x_i$$

$$0 = \sum x_i y_i - a_0 \sum x_i - a_1 \sum x_i^2$$

These are the normal equations, which can be solved simultaneously:

$$n a_0 + \left( \sum x_i \right) a_1 = \sum y_i$$

$$\left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 = \sum x_i y_i$$

giving

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}$$

$$a_0 = \bar{y} - a_1 \bar{x}$$

where $\bar{x}$ and $\bar{y}$ are the mean values of $x$ and $y$.
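
These closed-form expressions translate directly into code. A minimal sketch, reusing the invented data from the earlier example:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.5, 2.5, 2.0, 4.0, 3.5])

n = x.size
# Slope from the normal equations
a1 = (n * np.sum(x * y) - x.sum() * y.sum()) / (n * np.sum(x * x) - x.sum() ** 2)
# Intercept: a0 = ybar - a1 * xbar
a0 = y.mean() - a1 * x.mean()

print(a0, a1)   # for these points: a0 = 0.25, a1 = 0.75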

Figure 17.3

Figure 17.4

Figure 17.5

Goodness of fit. Let $S_t$ be the total sum of the squares around the mean of the dependent variable $y$, and let $S_r$ be the sum of the squares of the residuals around the regression line. Then $S_t - S_r$ quantifies the improvement, or error reduction, due to describing the data in terms of a straight line rather than as an average value:

$$r^2 = \frac{S_t - S_r}{S_t}$$

where $r^2$ is the coefficient of determination and $r = \sqrt{r^2}$ is the correlation coefficient.

For a perfect fit, $S_r = 0$ and $r = r^2 = 1$, signifying that the line explains 100 percent of the variability of the data. For $r = r^2 = 0$, $S_r = S_t$ and the fit represents no improvement.
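
Continuing the same sketch, the goodness of fit follows directly from $S_t$ and $S_r$:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.5, 2.5, 2.0, 4.0, 3.5])

# Slope and intercept from the normal equations (see the earlier sketch)
n = x.size
a1 = (n * np.sum(x * y) - x.sum() * y.sum()) / (n * np.sum(x * x) - x.sum() ** 2)
a0 = y.mean() - a1 * x.mean()

St = np.sum((y - y.mean()) ** 2)       # spread of the data about its mean
Sr = np.sum((y - a0 - a1 * x) ** 2)    # spread remaining about the regression line
r2 = (St - Sr) / St                    # coefficient of determination
r = np.sqrt(r2)                        # correlation coefficient

print(r2, r)   # for these points: r2 = 0.75, r = 0.866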


Polynomial Regression

Some engineering data are poorly represented by a straight line. In these cases, a curve is better suited to fit the data. The least-squares method can readily be extended to fit the data with higher-order polynomials (Sec. 17.2).
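
As an illustration of the idea (not the textbook's worked algorithm), NumPy's built-in least-squares polynomial fit can be used; the data below are sample values chosen to follow a roughly quadratic trend:

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Fit a second-order polynomial y = a0 + a1*x + a2*x^2 by least squares
coeffs = np.polyfit(x, y, deg=2)   # returns [a2, a1, a0], highest power first
print(coeffs)

y_model = np.polyval(coeffs, x)    # evaluate the fitted polynomial at the data
Sr = np.sum((y - y_model) ** 2)    # residual sum of squares of the fit
print(Sr)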

General Linear Least Squares


$$y = a_0 z_0 + a_1 z_1 + a_2 z_2 + \cdots + a_m z_m + e$$

where $z_0, z_1, \ldots, z_m$ are $m + 1$ basis functions. In matrix form:

$$\{Y\} = [Z]\{A\} + \{E\}$$

where $[Z]$ is the matrix of the calculated values of the basis functions at the measured values of the independent variable, $\{Y\}$ contains the observed values of the dependent variable, $\{A\}$ contains the unknown coefficients, and $\{E\}$ contains the residuals. The sum of the squares of the residuals is

$$S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2$$

which is minimized by taking its partial derivative with respect to each of the coefficients and setting the resulting equation equal to zero.
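
A minimal sketch of the matrix formulation, assuming an illustrative basis z0 = 1, z1 = sin(x), z2 = exp(-x) and invented observations (the basis choice is made up for the example):

import numpy as np

# Hypothetical observations
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.9, 2.3, 2.8, 2.9, 2.7, 2.2, 1.6])

# [Z]: each column holds one basis function evaluated at the measured x values
Z = np.column_stack([np.ones_like(x), np.sin(x), np.exp(-x)])

# Solve {Y} = [Z]{A} + {E} for {A} by minimizing S_r = ||Y - Z A||^2;
# res holds S_r when the system is overdetermined and full rank
A, res, rank, sv = np.linalg.lstsq(Z, y, rcond=None)
print(A)

Note that np.linalg.lstsq performs the minimization with an SVD-based routine, which is numerically more stable than forming and solving the normal equations explicitly.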
