Pertemuan 6


Transcript of Pertemuan 6

  • GEOKOMPUTASI Pertemuan ke-6

  • Curve Fitting and Optimization

    Introduction

    Least-Squares Regression

    Linear

    Non-Linear


  • Introduction

    Data is often given for discrete values along a continuum. However, you may require estimates at points between the discrete values. This section describes techniques to fit curves to such data in order to obtain intermediate estimates.

    There are two general approaches for curve fitting. Where the data exhibits a significant degree of error or noise, the strategy is to derive a single curve that represents the general trend of the data. One approach of this nature is called least-squares regression. Where the data is known to be very precise, the basic approach is to fit a curve or a series of curves that pass directly through each of the points. The estimation of values between well-known discrete points is called interpolation.

  • Figure: Three attempts to fit a best curve through five data points: (a) least-squares regression, (b) linear interpolation, (c) curvilinear interpolation.

  • Two types of applications are generally encountered when fitting experimental data: trend analysis and hypothesis testing.

    Trend analysis represents the process of using the pattern of the data to make predictions, that is, to predict or forecast values of the dependent variable.

    Hypothesis testing: an existing mathematical model is compared with measured data. If the model coefficients are unknown, it may be necessary to determine values that best fit the observed data. On the other hand, if estimates of the model coefficients are already available, it may be appropriate to compare predicted values of the model with observed values to test the adequacy of the model. Often, alternative models are compared and the best one is selected on the basis of empirical observations.

  • Least-Squares Regression

    Figure: (a) Data exhibiting significant error, (b) polynomial fit oscillating beyond the range of the data, (c) more satisfactory result using the least-squares fit.

  • Linear Regression

    The simplest example of a least-squares approximation is fitting a straight line to a set of paired observations: (x1, y1), (x2, y2), . . . , (xn, yn). The line is

    y = a0 + a1x + e

    where a0 = coefficient representing the intercept, a1 = coefficient representing the slope, and e = the error, or residual, between the model and the observations:

    e = y - a0 - a1x

    The error, or residual, is the discrepancy between the true value of y and the approximate value, a0 + a1x, predicted by the linear equation. A small sketch of this residual computation follows.
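    As a minimal sketch (the intercept and slope below are arbitrary trial values I chose for illustration, not fitted coefficients), the residual of each observation is the vertical gap between the measured y and the line:

```python
# Residuals e_i = y_i - (a0 + a1*x_i) for an arbitrary trial line.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

a0, a1 = 0.0, 1.0  # placeholder intercept and slope, for illustration only
residuals = [yi - (a0 + a1 * xi) for xi, yi in zip(x, y)]
print(residuals)
```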

  • Linear Regression

    Criteria for a "Best" Fit

    One strategy would be to minimize the sum of the residual errors for all the available data: criterion (a) minimizes the sum of the residuals.

  • Linear Regression

    Criteria for a "Best" Fit

    Criterion (b) minimizes the sum of the absolute values of the residuals.

  • Linear Regression

    Criteria for a "Best" Fit

    A third strategy for fitting a best line is the minimax criterion. In this technique, the line is chosen that minimizes the maximum distance that an individual point falls from the line: criterion (c) minimizes the maximum error of any individual point. It should be noted that the minimax principle is sometimes well-suited for fitting a simple function to a complicated function (Carnahan, Luther, and Wilkes, 1969). The three criteria are compared numerically in the sketch below.
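    As a minimal sketch (the line coefficients are again placeholders, not fitted values), the three criteria can be evaluated for any trial line:

```python
# Evaluate the three "best fit" criteria for one trial line.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
a0, a1 = 0.0, 1.0  # placeholder line, for illustration only

e = [yi - (a0 + a1 * xi) for xi, yi in zip(x, y)]
sum_e = sum(e)                        # criterion (a): positive and negative errors cancel
sum_abs_e = sum(abs(ei) for ei in e)  # criterion (b): sum of absolute residuals
max_abs_e = max(abs(ei) for ei in e)  # criterion (c): worst-case (minimax) error
print(sum_e, sum_abs_e, max_abs_e)
```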

  • Linear Regression

    Least-squares regression instead minimizes the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

    Sr = Σ ei^2 = Σ (yi - a0 - a1 xi)^2

  • Linear Regression

    Least-Squares Fit of a Straight Line

    To determine values for a0 and a1, differentiate Sr with respect to each coefficient. Setting these derivatives equal to zero will result in a minimum Sr.

  • Setting the derivatives equal to zero and realizing that Σ a0 = n a0, the two equations can be solved simultaneously for

    a1 = (n Σ xi yi - Σ xi Σ yi) / (n Σ xi^2 - (Σ xi)^2)

    a0 = ybar - a1 xbar

    where ybar and xbar are the means of y and x, respectively. These formulas translate directly into code, as sketched below.
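    A minimal sketch of the straight-line fit (the function name fit_line is my own, not from the slides):

```python
def fit_line(x, y):
    """Least-squares straight line y = a0 + a1*x via the normal equations."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * sx / n                       # intercept: ybar - a1*xbar
    return a0, a1
```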

  • Example

    Fit a straight line to the x and y values in the first two columns of the table below.

  • SOLUTION

    Xi        Yi        Xi^2
    1         0.5       1
    2         2.5       4
    3         2.0       9
    4         4.0       16
    5         3.5       25
    6         6.0       36
    7         5.5       49

    Σ Xi = 28    Σ Yi = 24    Σ Xi^2 = 140
    mean X = 4   mean Y = 3.428571
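    Applying the fit_line sketch from above to this table's data (a check I added; the fitted values are not quoted on the slide):

```python
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

a0, a1 = fit_line(x, y)
print(f"y = {a0:.5f} + {a1:.5f} x")  # with this data: a0 ≈ 0.07143, a1 ≈ 0.83929
```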

  • Quantification of Error of Linear Regression

    For the mean, the square of the residual represented the square of the discrepancy between the data and a single estimate of the measure of central tendency: the mean.

    For linear regression, the square of the residual represents the square of the vertical distance between the data and another measure of central tendency: the straight line.

  • Figure: Regression data showing (a) the spread of the data around the mean of the dependent variable and (b) the spread of the data around the best-fit line. The reduction in the spread in going from (a) to (b), as indicated by the bell-shaped curves at the right, represents the improvement due to linear regression.
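    This spread reduction can be quantified with the standard least-squares error measures; a minimal sketch building on fit_line (the function name and the r^2 label are mine, following common usage):

```python
import math

def regression_error(x, y, a0, a1):
    """Spread around the mean (St), spread around the line (Sr),
    standard error of the estimate, and coefficient of determination."""
    n = len(y)
    ybar = sum(y) / n
    St = sum((yi - ybar) ** 2 for yi in y)                      # spread around the mean
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread around the line
    s_yx = math.sqrt(Sr / (n - 2))  # standard error of the estimate for a straight line
    r2 = (St - Sr) / St             # fraction of the spread explained by the regression
    return St, Sr, s_yx, r2
```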

  • Polynomial Regression

    For example, suppose that we fit a second-order polynomial, or quadratic: y = a0 + a1x + a2x^2 + e.

  • Taking the derivative of the sum of squares with respect to each unknown coefficient and setting it to zero yields three normal equations, where all summations are from i = 1 through n. Note that these three equations are linear and have three unknowns: a0, a1, and a2. The coefficients of the unknowns can be calculated directly from the observed data.

    The two-dimensional case can be easily extended to an mth-order polynomial as

    y = a0 + a1x + a2x^2 + . . . + amx^m + e

    For an mth-order polynomial, the standard error is formulated as

    sy/x = sqrt( Sr / (n - (m + 1)) )
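    A minimal sketch of the mth-order polynomial fit via the normal equations, solved with naive Gauss elimination as the slides suggest (the function name polyfit_normal is my own; no pivoting safeguards are included):

```python
def polyfit_normal(x, y, m):
    """Least-squares mth-order polynomial y = a0 + a1*x + ... + am*x^m."""
    # Normal equations: A[j][k] = sum(x_i^(j+k)), b[j] = sum(x_i^j * y_i)
    A = [[sum(xi ** (j + k) for xi in x) for k in range(m + 1)] for j in range(m + 1)]
    b = [sum((xi ** j) * yi for xi, yi in zip(x, y)) for j in range(m + 1)]

    # Forward elimination (naive, assumes nonzero pivots)
    for p in range(m):
        for r in range(p + 1, m + 1):
            f = A[r][p] / A[p][p]
            for c in range(p, m + 1):
                A[r][c] -= f * A[p][c]
            b[r] -= f * b[p]

    # Back substitution
    a = [0.0] * (m + 1)
    for r in range(m, -1, -1):
        a[r] = (b[r] - sum(A[r][c] * a[c] for c in range(r + 1, m + 1))) / A[r][r]
    return a  # coefficients [a0, a1, ..., am]
```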

  • EXAMPLE

    Fit a second-order polynomial to the data in the first two columns.

  • SOLUTION

    Solving these equations through a technique such as Gauss elimination gives a0 = 2.47857, a1 = 2.35929, and a2 = 1.86071. Therefore, the least-squares quadratic equation for this case is

    y = 2.47857 + 2.35929x + 1.86071x^2

    The standard error of the estimate based on the regression polynomial is computed from sy/x = sqrt( Sr / (n - (m + 1)) ).
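    The data table for this example did not survive extraction. Assuming the commonly used textbook dataset for this problem (which reproduces the quoted coefficients), the quadratic fit and its standard error can be checked with polyfit_normal from above:

```python
import math

# Assumed data: the original table was lost in extraction; these are the
# standard textbook values for this example and should reproduce a0, a1, a2.
x = [0, 1, 2, 3, 4, 5]
y = [2.1, 7.7, 13.6, 27.2, 40.9, 61.1]

n, m = len(x), 2
a = polyfit_normal(x, y, m)  # expect roughly [2.47857, 2.35929, 1.86071]

# Standard error of the estimate, sy/x = sqrt(Sr / (n - (m + 1)))
Sr = sum((yi - sum(a[j] * xi ** j for j in range(m + 1))) ** 2
         for xi, yi in zip(x, y))
s_yx = math.sqrt(Sr / (n - (m + 1)))
print(a, s_yx)
```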