
Chapter 4 Interpolation and Approximation

4.1 Lagrange Interpolation

The basic interpolation problem can be posed in one of two ways: given a set of nodes x_i, 0 <= i <= n, and corresponding data values y_i, find the polynomial p_n of degree <= n such that p_n(x_i) = y_i for all i; or, given a set of nodes x_i and a continuous function f, find the polynomial p_n of degree <= n such that p_n(x_i) = f(x_i) for all i. In either case the interpolating polynomial exists and is unique.

Example 4.1


Discussion

The construction presented in this section is called Lagrange interpolation. How good is interpolation at approximating a function? (See Sections 4.3 and 4.11.) Consider another example:

If we use a fourth-degree interpolating polynomial to approximate this function, the results are as shown in Figure 4.3 (a).

[Figure 4.3: the interpolating polynomials for n = 4, n = 8, and n = 16, and the error for n = 8.]
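Since the formulas from the original slides are not reproduced here, the following is a minimal sketch of the Lagrange construction, p_n(x) = sum_i f(x_i) L_i(x) with L_i(x) = prod_{j != i} (x - x_j)/(x_i - x_j), applied to the classic test function f(x) = 1/(1 + 25x^2) on [-1, 1]. The function is an illustrative choice that exhibits the growing-error behavior just described; it may or may not be the one used in Figure 4.3.

```python
import numpy as np

def lagrange_interp(x_nodes, y_nodes, x):
    """Evaluate the Lagrange form of the interpolating polynomial at x:
    p_n(x) = sum_i y_i * L_i(x), L_i(x) = prod_{j != i} (x - x_j) / (x_i - x_j)."""
    p = 0.0
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        L = 1.0
        for j, xj in enumerate(x_nodes):
            if j != i:
                L *= (x - xj) / (xi - xj)
        p += yi * L
    return p

f = lambda x: 1.0 / (1.0 + 25.0 * x**2)   # classic example for equally spaced nodes

for n in (4, 8, 16):
    nodes = np.linspace(-1.0, 1.0, n + 1)   # equally spaced nodes on [-1, 1]
    ys = f(nodes)
    xs = np.linspace(-1.0, 1.0, 401)        # fine grid for measuring the error
    err = max(abs(f(t) - lagrange_interp(nodes, ys, t)) for t in xs)
    print(f"n = {n:2d}, max error on [-1, 1] = {err:.3e}")   # grows as n increases
```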

Discussion

There are circumstances in which polynomial interpolation, used as an approximation, works very well, and other circumstances in which it does not. Moreover, the Lagrange form of the interpolating polynomial is not well suited for actual computations; there is an alternative construction that is far superior to it.


4.2 Newton Interpolation and Divided Differences

The disadvantage of the Lagrange form: if we decide to add a point to the set of nodes, we have to completely recompute all of the basis functions.

Here we introduce an alternative form of the polynomial, the Newton form, which allows us to easily write p_n(x) in terms of p_{n-1}(x).


Newton Interpolation




Example 4.2


Discussion

The coefficients in the Newton form are called divided differences. We can use a divided-difference table to compute them.
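A minimal sketch of building the divided-difference table and evaluating the Newton form by nested multiplication; the function names and the test data are illustrative, not the text's.

```python
import numpy as np

def divided_differences(x, y):
    """Return the Newton-form coefficients f[x0], f[x0,x1], ..., f[x0,...,xn],
    i.e. the top edge of the divided-difference table, built column by column."""
    x = np.asarray(x, dtype=float)
    coef = np.array(y, dtype=float)
    for k in range(1, len(x)):
        # after this step, coef[i] holds f[x_{i-k}, ..., x_i] for i >= k
        coef[k:] = (coef[k:] - coef[k - 1:-1]) / (x[k:] - x[:-k])
    return coef

def newton_eval(x_nodes, coef, t):
    """Evaluate p_n(t) = c_0 + c_1 (t - x_0) + ... + c_n (t - x_0)...(t - x_{n-1})
    by nested multiplication."""
    p = coef[-1]
    for c, xk in zip(coef[-2::-1], x_nodes[-2::-1]):
        p = p * (t - xk) + c
    return p

# Illustrative data: samples of ln(x).
xs = [1.0, 2.0, 4.0, 5.0]
ys = [0.0, 0.693147, 1.386294, 1.609438]
c = divided_differences(xs, ys)
print("Newton coefficients:", c)
print("p(3) =", newton_eval(xs, c, 3.0), "  ln(3) =", np.log(3.0))
```

Note that appending a new node only adds one more coefficient at the end of the table; the coefficients already computed do not change, which is exactly the advantage over the Lagrange form described above.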


Example 4.3


Example 4.3 (Cont.)


Table 4.5


4.3 Interpolation Error
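The detailed slides for this section are not reproduced here. For reference, the standard polynomial interpolation error theorem, presumably the result cited as (4.20) in Section 4.5, can be stated as follows; the text's exact statement may differ in notation.

```latex
% If f has n+1 continuous derivatives on an interval [a, b] containing the
% nodes x_0, ..., x_n, then for each x in [a, b] there is a point xi_x,
% depending on x, such that
\[
  f(x) - p_n(x) \;=\; \frac{f^{(n+1)}(\xi_x)}{(n+1)!}\;\prod_{i=0}^{n}(x - x_i).
\]
```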


4.5 Application: More Approximations to the Derivative

Note that the point ξ in the interpolation error term depends on x, which must be taken into account when the error formula is differentiated.



We write the interpolating polynomial in Lagrange form and include the error term as given in (4.20). Differentiating this expression and evaluating the result at the nodes produces finite difference approximations to f'(x), together with their error terms.
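As a concrete sketch, the formulas below are the standard three-point results obtained by differentiating the quadratic interpolant through x_0, x_0 + h, x_0 + 2h and evaluating at the nodes; the text's notation and numbering may differ. The code checks the expected O(h^2) behavior on a test function.

```python
import numpy as np

# Three-point derivative approximations obtained by differentiating the
# quadratic interpolant through x0, x0 + h, x0 + 2h and evaluating at a node.
def d_onesided(f, x0, h):
    """One-sided approximation to f'(x0), O(h^2)."""
    return (-3.0 * f(x0) + 4.0 * f(x0 + h) - f(x0 + 2.0 * h)) / (2.0 * h)

def d_centered(f, x1, h):
    """Centered approximation to f'(x1), O(h^2)."""
    return (f(x1 + h) - f(x1 - h)) / (2.0 * h)

f, fprime = np.sin, np.cos      # test function with a known derivative
x = 1.0
for h in (0.1, 0.05, 0.025):    # errors should drop by about 4 when h is halved
    print(f"h = {h:6.3f}   one-sided error = {abs(d_onesided(f, x, h) - fprime(x)):.2e}"
          f"   centered error = {abs(d_centered(f, x, h) - fprime(x)):.2e}")
```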


4.7 Piecewise Polynomial Interpolation

If we keep the order of the polynomial fixed and use different polynomials over different intervals, with the length of the intervals getting smaller and smaller, then interpolation can be a very accurate and powerful approximation tool.

For example, the simplest case is piecewise linear interpolation, in which a separate straight-line interpolant is used on each subinterval; a sketch is given below.
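A minimal sketch of piecewise linear interpolation on a uniform partition of [0, 1]; the interval and the function e^x are illustrative assumptions, not necessarily the example on the original slides.

```python
import numpy as np

def piecewise_linear(x_nodes, y_nodes, t):
    """Evaluate the piecewise linear interpolant at t: on each subinterval
    [x_i, x_{i+1}] the approximation is the straight line through the two
    endpoint values."""
    i = np.searchsorted(x_nodes, t) - 1
    i = min(max(i, 0), len(x_nodes) - 2)          # clamp to a valid subinterval
    x0, x1 = x_nodes[i], x_nodes[i + 1]
    return y_nodes[i] + (y_nodes[i + 1] - y_nodes[i]) * (t - x0) / (x1 - x0)

f = np.exp                                         # illustrative smooth function
for n in (4, 8, 16, 32):                           # number of subintervals
    nodes = np.linspace(0.0, 1.0, n + 1)
    vals = f(nodes)
    ts = np.linspace(0.0, 1.0, 1001)
    err = max(abs(f(t) - piecewise_linear(nodes, vals, t)) for t in ts)
    print(f"n = {n:2d}  max error = {err:.3e}")    # drops by about 4 per doubling of n
```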


Example 4.6


4.8 An Introduction to Splines

4.8.1 Definition of the problem


Discussion

From the definition:

- d, the degree of the approximation (the polynomial degree of the spline pieces), is related to the number of unknown coefficients (the degrees of freedom).
- N, the degree of smoothness, is related to the number of constraints.


Discussion

We can make the first term vanish by setting N = d - 1. This establishes a relationship between the polynomial degree of the spline and its degree of smoothness. For example, in the common case of cubic splines, d = 3 and N = 2.
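A compact version of the counting argument behind this statement, under the assumptions that there are n subintervals, interpolation is imposed at the n + 1 knots, and derivatives up to order N are matched at the n - 1 interior knots; the text's bookkeeping may differ slightly.

```latex
% free parameters:  n(d+1)               (d+1 coefficients on each of n subintervals)
% constraints:      (n-1)(N+1) + (n+1)   (matching at interior knots + interpolation)
\[
  n(d+1) - (n-1)(N+1) - (n+1) \;=\; n(d - N - 1) + N .
\]
% Choosing N = d - 1 makes the term proportional to n vanish, leaving N extra
% degrees of freedom; for cubic splines (d = 3, N = 2) these are the two
% additional end conditions discussed below.
```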


4.8.2 Cubic B-Splines

B-spline: assume a uniform grid of knots with spacing h.


Cubic B-Splines

How do we know that B(x) is a cubic spline function?

Compute the one-sided first derivatives at the knots, and similarly the one-sided second derivatives. If the one-sided values are equal to each other, then the first and second derivatives are continuous, and hence B is a cubic spline. Note that B is only locally defined, meaning that it is nonzero only on a small interval.
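A minimal numerical check of this claim. It assumes the common cardinal cubic B-spline normalization, B(x_i) = 2/3 and B(x_i ± h) = 1/6; the text's definition may be scaled by a constant, which does not affect continuity. The code estimates one-sided first and second derivatives at each knot and confirms that the left and right values agree.

```python
h = 0.5       # knot spacing (arbitrary choice for this sketch)

def B(x):
    """Cardinal cubic B-spline centered at 0, with knots -2h, -h, 0, h, 2h,
    normalized so that B(0) = 2/3 and B(+-h) = 1/6."""
    t = abs(x) / h
    if t < 1.0:
        return (4.0 - 6.0 * t**2 + 3.0 * t**3) / 6.0
    if t < 2.0:
        return (2.0 - t)**3 / 6.0
    return 0.0

d = 1e-5      # step for the one-sided difference quotients
for knot in (-2*h, -h, 0.0, h, 2*h):
    d1_left  = (B(knot) - B(knot - d)) / d
    d1_right = (B(knot + d) - B(knot)) / d
    d2_left  = (B(knot) - 2.0*B(knot - d) + B(knot - 2*d)) / d**2
    d2_right = (B(knot + 2*d) - 2.0*B(knot + d) + B(knot)) / d**2
    # left/right values agree to within the O(d) accuracy of the quotients,
    # so B' and B'' are continuous across every knot (B is a cubic spline)
    print(f"x = {knot:5.2f}   B': {d1_left:7.3f} {d1_right:7.3f}   "
          f"B'': {d2_left:7.3f} {d2_right:7.3f}")
```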

A Spline Approximation

We can use B to construct a spline approximation to an arbitrary function f. Define the sequence of functions B_i, i = -1, 0, ..., n + 1, where B_i is the basic B-spline centered at the node x_i.


[Figure: the B-spline B_i for x_i = 0.4, h = 0.05, and for x_i = 0.75, h = 0.05.]


Requiring the spline q(x) = Σ c_i B_i(x) to interpolate f at the nodes x_0, ..., x_n gives n + 1 equations in the n + 3 unknown coefficients c_{-1}, ..., c_{n+1}.



A Spline Approximation

Now, we need to come up with two additional constraints in order to eliminate two of the unknowns.

Two common choices are:

- The natural spline: a simple construction, but it leads to higher error near the end points.
- The complete spline: better approximation properties, and it does not actually require knowing the exact derivative of f at the end points.

Natural Spline

From the natural end conditions q''(x_0) = q''(x_n) = 0, the two extra coefficients can be eliminated, reducing the problem to a tridiagonal system of n - 1 equations in n - 1 unknowns.
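A minimal sketch of solving the natural-spline interpolation problem numerically. It assumes the cardinal B-spline normalization used above (B(x_i) = 2/3, B(x_i ± h) = 1/6, B''(x_i) = -2/h^2, B''(x_i ± h) = 1/h^2), so the interpolation conditions read c_{i-1} + 4c_i + c_{i+1} = 6 f(x_i) and the natural end conditions read c_{i-1} - 2c_i + c_{i+1} = 0 at i = 0 and i = n; the constants in the text may differ by the normalization factor. For simplicity the code keeps all n + 3 coefficients and solves the full system rather than eliminating the two extra ones first.

```python
import numpy as np

def natural_spline_coeffs(x, f_vals):
    """Solve for the B-spline coefficients c_{-1}, ..., c_{n+1} of the natural
    cubic spline interpolant on a uniform grid (normalization B(x_i) = 2/3)."""
    n = len(x) - 1
    A = np.zeros((n + 3, n + 3))
    b = np.zeros(n + 3)
    # interpolation conditions: c_{i-1} + 4 c_i + c_{i+1} = 6 f(x_i), i = 0..n
    for i in range(n + 1):
        A[i, i:i + 3] = [1.0, 4.0, 1.0]
        b[i] = 6.0 * f_vals[i]
    # natural end conditions: q''(x_0) = q''(x_n) = 0
    A[n + 1, 0:3] = [1.0, -2.0, 1.0]
    A[n + 2, n:n + 3] = [1.0, -2.0, 1.0]
    return np.linalg.solve(A, b)        # entry j corresponds to c_{j-1}

def spline_eval(x, c, t):
    """Evaluate q(t) = sum_j c_j B_j(t) using the locally supported B-splines."""
    h = x[1] - x[0]
    def B(s):                           # cardinal cubic B-spline, B(0) = 2/3
        s = abs(s) / h
        return (4 - 6*s**2 + 3*s**3) / 6 if s < 1 else ((2 - s)**3 / 6 if s < 2 else 0.0)
    return sum(c[j] * B(t - (x[0] + (j - 1) * h)) for j in range(len(c)))

# Illustrative use: interpolate sin(x) on [0, pi].
x = np.linspace(0.0, np.pi, 9)
c = natural_spline_coeffs(x, np.sin(x))
ts = np.linspace(0.0, np.pi, 200)
print("max |q - sin| =", max(abs(spline_eval(x, c, t) - np.sin(t)) for t in ts))
```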


Complete Spline

From the end conditions q'(x_0) = f'(x_0) and q'(x_n) = f'(x_n), the two extra coefficients can again be eliminated, leaving a tridiagonal system of n + 1 equations in the n + 1 unknowns c_0, ..., c_n.


Example 4.7


Example 4.8


Discussion

The advantage of spline interpolation lies in the smoothness of the approximation.



4.9 Application: Solution of Boundary Value Problems

Consider the two-point boundary value problem:

We construct a uniform grid of points x_i = a + ih, 0 <= i <= n, with spacing h = (b - a)/n.

We now look for our approximation in the form of a cubic spline defined on this grid, written as a combination of the B-splines associated with the grid.


The advantage of this approach is that we obtain a continuous, smooth approximating function. Because we know the values of the B-splines and their derivatives at each of the nodes, we can easily reduce this to a system of equations: n + 1 equations in n + 3 unknowns.


We can eliminate the two extra unknowns by imposing the boundary conditions on the approximation. Substituting these into the first and last equations of the rectangular system removes the two extra coefficients.


We are then left with a square system of n + 1 equations in n + 1 unknowns.
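The particular boundary value problem and coefficient functions from the original slides are not reproduced here, so the following sketch uses an assumed model problem, -u'' + u = (1 + π^2) sin(πx) on [0, 1] with u(0) = u(1) = 0 (exact solution u(x) = sin(πx)), together with the same B-spline normalization as in the sketches above. It is meant only to show the structure: collocation at the nodes gives n + 1 equations and the two boundary conditions supply the remaining two; for simplicity the code solves the full (n + 3) x (n + 3) system instead of eliminating the two extra unknowns.

```python
import numpy as np

# Model problem (an assumption for illustration): -u'' + u = f on [0, 1],
# u(0) = u(1) = 0, with f chosen so that u(x) = sin(pi x) is the exact solution.
f = lambda x: (1.0 + np.pi**2) * np.sin(np.pi * x)

n = 16
x = np.linspace(0.0, 1.0, n + 1)
h = x[1] - x[0]

# Unknowns: B-spline coefficients c_{-1}, ..., c_{n+1} (array indices 0..n+2).
# With B(x_i) = 2/3, B(x_i +/- h) = 1/6, B''(x_i) = -2/h^2, B''(x_i +/- h) = 1/h^2:
#   u(x_i)   = (c_{i-1} + 4 c_i + c_{i+1}) / 6
#   u''(x_i) = (c_{i-1} - 2 c_i + c_{i+1}) / h^2
A = np.zeros((n + 3, n + 3))
b = np.zeros(n + 3)
for i in range(n + 1):                      # collocation: -u''(x_i) + u(x_i) = f(x_i)
    A[i, i:i + 3] += np.array([-1.0, 2.0, -1.0]) / h**2
    A[i, i:i + 3] += np.array([1.0, 4.0, 1.0]) / 6.0
    b[i] = f(x[i])
A[n + 1, 0:3] = [1.0, 4.0, 1.0]             # boundary condition u(0) = 0
A[n + 2, n:n + 3] = [1.0, 4.0, 1.0]         # boundary condition u(1) = 0
c = np.linalg.solve(A, b)

# Check the error at the nodes against the exact solution.
u_nodes = (c[:-2] + 4.0 * c[1:-1] + c[2:]) / 6.0
print("max nodal error:", np.max(np.abs(u_nodes - np.sin(np.pi * x))))
```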


Example 4.9


Solving this system gives the computed approximation.


4.10 Least Squares Concepts in Approximation

4.10.1 An introduction to data fitting


Least Squares Data Fitting
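The working formulas from the original slides are not reproduced here. As a minimal sketch of the basic idea, the following fits a straight line y = a + b x to made-up data (purely illustrative, not the text's example) by minimizing the sum of squared residuals through the 2 x 2 normal equations.

```python
import numpy as np

# Made-up data for illustration only.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1, 5.8])

# Fit y ~ a + b x by minimizing sum_i (a + b x_i - y_i)^2.
# The normal equations for the 2x2 system are:
#   [ n       sum x   ] [a]   [ sum y   ]
#   [ sum x   sum x^2 ] [b] = [ sum x y ]
n = len(x)
A = np.array([[n, x.sum()], [x.sum(), (x**2).sum()]])
rhs = np.array([y.sum(), (x * y).sum()])
a, b = np.linalg.solve(A, rhs)
print(f"least squares line: y = {a:.4f} + {b:.4f} x")

# Cross-check against numpy's built-in least squares polynomial fit.
print("np.polyfit check:", np.polyfit(x, y, 1)[::-1])
```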


Example 4.10


Example 4.11


4.10.2 Least Squares Approximation and Orthogonal Polynomials

Let f be a continuous function on [a, b]; we seek a polynomial p_n of degree <= n such that the least squares error ||f - p_n||_2 is minimized.


Inner Products

Inner product of functions: (f, g) = ∫_a^b f(x) g(x) w(x) dx, where w(x) >= 0 is a weight function (often w ≡ 1).

Inner product on real vector spaces: (x, y) = Σ_{i=1}^n x_i y_i.


Inner Products


The definition of inner product will allow us to apply a number of ideas from linear algebra to the construction of approximations.


The normal equations for the least squares coefficients can be organized along matrix-vector lines.

If our basis functions satisfy the orthogonality condition (φ_i, φ_j) = 0 for i ≠ j (the special basis polynomials that satisfy this condition are called orthogonal polynomials), then the matrix is diagonal, and we very easily obtain c_i = (f, φ_i) / (φ_i, φ_i).


Orthogonal Polynomials

Legendre polynomials: P_0(x) = 1, P_1(x) = x, and (n + 1) P_{n+1}(x) = (2n + 1) x P_n(x) - n P_{n-1}(x); they are orthogonal on [-1, 1] with weight w ≡ 1.
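A brief sketch of least squares approximation with Legendre polynomials on [-1, 1], using the orthogonality relation to compute c_k = (f, P_k) / (P_k, P_k) with (P_k, P_k) = 2/(2k + 1). The test function e^x is an illustrative assumption, and the sketch assumes numpy and scipy are available.

```python
import numpy as np
from numpy.polynomial import legendre as L
from scipy.integrate import quad

f = np.exp                      # illustrative function to approximate on [-1, 1]
degree = 4

# Orthogonality of the Legendre polynomials on [-1, 1] gives the coefficients
# directly: c_k = (f, P_k) / (P_k, P_k), with (P_k, P_k) = 2 / (2k + 1).
coeffs = []
for k in range(degree + 1):
    Pk = L.Legendre.basis(k)
    num, _ = quad(lambda t: f(t) * Pk(t), -1.0, 1.0)
    coeffs.append(num / (2.0 / (2.0 * k + 1.0)))

approx = L.Legendre(coeffs)     # q(x) = sum_k c_k P_k(x)
ts = np.linspace(-1.0, 1.0, 401)
print("max |f - q| on [-1, 1]:", np.max(np.abs(f(ts) - approx(ts))))
```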


Example 4.12


Example 4.13


4.11 Advanced Topics in Interpolation Error

You can read this section on your own.

