Copyright The McGraw-Hill Companies, Inc. Permission required for reproduction or display.
Course Policies
Lecturer: Dr Intan Salwani Ahamad @ ext. 4407 @ intan@eng.upm.edu.my
Textbook: Numerical Methods for Engineers, 5th Edition, Steven C. Chapra & Raymond P. Canale, McGraw-Hill International Edition, RM60. Serdang old flat bookshop.
Timetable: Lectures: R10-12 DK5, K12-1 DK5. Lab: S10-1 UIT3 (G1), K2-5 UIT3 (G2)

Objectives:
To describe the principles of optimization and simulation in the chemical industry
To apply numerical methods in solving curve fitting and root-finding problems
To apply numerical methods for solving ordinary and partial differential equations
To apply optimization methods in solving engineering problems

Program Outcomes:
PO1 (K): Apply knowledge of mathematics and engineering science.
PO3 (K): Analyze and interpret data.
PO4 (K): Design systems, components, or processes to meet design requirements.
PO8 (K): Identify, formulate, and provide creative, innovative, and effective solutions to problems.
PO14 (S): Use engineering skills, techniques, and tools for practice in industry.
PO15 (K): Solve problems in advanced design and development.
Curve Fitting
Describes techniques to fit curves to discrete data in order to obtain intermediate estimates. There are two general approaches to curve fitting:

1. The data exhibit a significant degree of scatter. The strategy is to derive a single curve that represents the general trend of the data (Fig. 5.1a).
2. The data are very precise. The strategy is to pass a curve or a series of curves through each of the points (Fig. 5.1b, c).

For quick estimates, results are subjective; e.g. a straight line (5.1a), linear interpolation (5.1b), or curves (5.1c) can all be drawn from the same data.

Two typical applications:
Trend analysis. Predicting values of the dependent variable; may include extrapolation beyond the data points or interpolation between data points.
Hypothesis testing. Comparing an existing mathematical model with measured data.
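As a sketch of the two approaches, the snippet below (made-up data, NumPy assumed available) contrasts interpolation through precise points with a single least-squares trend line for scattered data:

```python
# Illustrative only: two curve-fitting strategies on the same (made-up) data.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])   # hypothetical measurements

# Precise data: linear interpolation passes exactly through each point
y_mid = np.interp(1.5, x, y)               # estimate midway between x=1 and x=2

# Scattered data: one straight line for the general trend (least squares)
a1, a0 = np.polyfit(x, y, 1)               # slope, intercept
y_trend = a0 + a1 * 1.5

print(y_mid, y_trend)
```

The interpolated value reproduces the local data exactly, while the trend line smooths over the scatter; which estimate is appropriate depends on how precise the data are.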
Figure 5.1
Simple Statistics
In the course of an engineering study, if several measurements are made of a particular quantity, additional insight can be gained by summarizing the data in one or more well-chosen statistics that convey as much information as possible about specific characteristics of the data set. These descriptive statistics are most often selected to represent:

The location of the center of the distribution of the data
The degree of spread of the data
Arithmetic mean. The sum of the individual data points ($y_i$) divided by the number of points ($n$):

$$\bar{y} = \frac{\sum y_i}{n}, \qquad i = 1, \ldots, n$$

Standard deviation. A measure of spread about the mean:

$$S_y = \sqrt{\frac{S_t}{n-1}}, \qquad S_t = \sum (y_i - \bar{y})^2$$

Variance:

$$S_y^2 = \frac{\sum (y_i - \bar{y})^2}{n-1} \quad \text{or} \quad S_y^2 = \frac{\sum y_i^2 - \left(\sum y_i\right)^2 / n}{n-1}$$

where $n - 1$ is the degrees of freedom.

Coefficient of variation. The spread normalized by the mean:

$$\text{c.v.} = \frac{S_y}{\bar{y}} \times 100\%$$
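A minimal sketch of these statistics, assuming a small set of made-up readings:

```python
# Descriptive statistics on hypothetical measurements of one quantity.
import math

y = [6.495, 6.595, 6.615, 6.635, 6.485, 6.555]   # made-up readings
n = len(y)

mean = sum(y) / n                                 # arithmetic mean
St = sum((yi - mean) ** 2 for yi in y)            # total sum of squares about the mean
Sy = math.sqrt(St / (n - 1))                      # standard deviation (n-1 degrees of freedom)
cv = Sy / mean * 100                              # coefficient of variation, %

print(mean, Sy, cv)
```

Note the `n - 1` divisor: one degree of freedom is consumed by estimating the mean from the same data.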
Least-Squares Regression
Used for data with substantial error, e.g. experimental data. The goal is to derive an approximation function that fits the general trend without necessarily matching the individual points.
One way is to visually inspect the plotted data and sketch a "best" line through the points. This is deficient: different analysts would draw different lines.
A better approach is to establish an objective basis, e.g. derive the curve that minimizes the discrepancy between the data points and the curve: least-squares regression.
Fig 17.1
Linear Regression
Fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), …, (xn, yn).

y = a0 + a1x + e

a1 - slope
a0 - intercept
e - error, or residual, between the model and the observations
One criterion: minimize the sum of the residual errors for all available data:

$$\sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)$$

This is inadequate, since positive and negative errors cancel. Another criterion: minimize the sum of the absolute values of the residuals:

$$\sum_{i=1}^{n} |e_i| = \sum_{i=1}^{n} |y_i - a_0 - a_1 x_i|$$

This is also inadequate, since it does not yield a unique best line.
Minimax criterion minimizes the maximum distance that an individual point falls from the line. Also an inadequate criterion (Fig. 17.3b)
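A quick demonstration of why the plain sum of residuals fails as a criterion: any straight line through the centroid (x̄, ȳ) drives the sum to zero, so the criterion cannot single out one best line. Data here are made up:

```python
# Demo: the sum of residuals is zero for ANY line through the centroid,
# so minimizing it cannot identify a unique best-fit line.
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 3.0, 5.0, 6.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

def residual_sum(a0, a1):
    return sum(yi - a0 - a1 * xi for xi, yi in zip(x, y))

# Two very different lines, both passing through (xbar, ybar):
flat  = residual_sum(ybar, 0.0)                 # horizontal line y = ybar
steep = residual_sum(ybar - 5.0 * xbar, 5.0)    # slope-5 line through the centroid

print(flat, steep)   # both sums are zero
```

Squaring the residuals (next slide) removes the sign cancellation and yields a unique minimum.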
Figure 17.2
The best strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_{i,\text{measured}} - y_{i,\text{model}} \right)^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
Yields a unique line for a given set of data. Need to find the values of a0 and a1 that minimize.
Differentiating $S_r$ with respect to each coefficient and setting the results to zero:

$$0 = \sum y_i - \sum a_0 - \sum a_1 x_i$$

$$0 = \sum y_i x_i - \sum a_0 x_i - \sum a_1 x_i^2$$

Since $\sum a_0 = n a_0$, these can be written as the normal equations:

$$n a_0 + \left( \sum x_i \right) a_1 = \sum y_i$$

$$\left( \sum x_i \right) a_0 + \left( \sum x_i^2 \right) a_1 = \sum x_i y_i$$

Solving simultaneously:

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}$$

$$a_0 = \bar{y} - a_1 \bar{x}$$

where $\bar{x}$ and $\bar{y}$ are the mean values.
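The slope and intercept formulas can be sketched directly; the data below are illustrative, not from the text:

```python
# Least-squares straight line via the closed-form normal-equation solution.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.5, 2.5, 2.0, 4.0, 3.5]   # made-up observations
n = len(x)

sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sxx = sum(xi * xi for xi in x)

a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
a0 = sy / n - a1 * sx / n                        # intercept: ybar - a1*xbar

print(a0, a1)
```

With these five points the line works out to y = 0.25 + 0.75x; each sum is accumulated once, so the fit costs only a single pass over the data.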
Figure 17.3
Figure 17.4
Figure 17.5
Goodness of fit:
The total sum of the squares around the mean for the dependent variable y is St.
The sum of the squares of the residuals around the regression line is Sr.
St − Sr quantifies the improvement, or error reduction, due to describing the data in terms of a straight line rather than as an average value.
$$r^2 = \frac{S_t - S_r}{S_t}$$

where $r^2$ is the coefficient of determination and $r$ is the correlation coefficient.
For a perfect fit, Sr = 0 and r = r² = 1, signifying that the line explains 100 percent of the variability of the data. For r = r² = 0, Sr = St and the fit represents no improvement over the mean.
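A short sketch of the r² computation, reusing the normal-equation slope and intercept on made-up data:

```python
# Coefficient of determination r^2 = (St - Sr)/St for a straight-line fit.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.5, 2.5, 2.0, 4.0, 3.5]   # made-up observations
n = len(x)

# least-squares line (normal equations)
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sxx = sum(xi * xi for xi in x)
a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
a0 = sy / n - a1 * sx / n

ybar = sy / n
St = sum((yi - ybar) ** 2 for yi in y)                       # spread about the mean
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))   # spread about the line
r2 = (St - Sr) / St

print(r2)
```

Here the line removes three quarters of the variability about the mean (r² = 0.75); the remaining quarter is residual scatter.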
Polynomial Regression
Some engineering data are poorly represented by a straight line. In these cases, a curve is better suited to fit the data. The least-squares method can readily be extended to fit data to higher-order polynomials (Sec. 17.2).
In matrix form:

$$\{Y\} = [Z]\{A\} + \{E\}$$

[Z] - matrix of the calculated values of the basis functions
{Y} - observed values of the dependent variable
{A} - unknown coefficients
{E} - residuals
$$S_r = \sum_{i=1}^{n} \left( y_i - \sum_{j=0}^{m} a_j z_{ji} \right)^2$$
Sr is minimized by taking its partial derivative with respect to each of the coefficients and setting the resulting equations equal to zero.
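A minimal sketch of the matrix formulation for a quadratic fit, assuming NumPy is available; minimizing Sr leads to the normal equations [Z]ᵀ[Z]{A} = [Z]ᵀ{Y}, solved here for illustrative data:

```python
# General linear least squares: quadratic fit via the normal equations.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])   # illustrative data

# Basis functions for a 2nd-order polynomial: z0 = 1, z1 = x, z2 = x^2
Z = np.column_stack([np.ones_like(x), x, x**2])

# Normal equations: Z^T Z A = Z^T y, solved for the coefficients {A}
A = np.linalg.solve(Z.T @ Z, Z.T @ y)

residuals = y - Z @ A
Sr = float(residuals @ residuals)       # minimized sum of squared residuals

print(A, Sr)
```

At the minimum, the residual vector is orthogonal to every column of [Z], which is exactly what the normal equations enforce. For higher orders or ill-conditioned data, solving the least-squares problem via QR (e.g. `np.linalg.lstsq`) is numerically safer than forming [Z]ᵀ[Z] explicitly.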