$m+1$ zeros. This means that the coefficients $p_i(0)$ have to vanish for $0 \le i \le m$, showing in particular that the constant polynomial $p_m$ is zero. Now assume that our assumption is true for $j$. Then $p_m \equiv \cdots \equiv p_{m-j} \equiv 0$ shows that
\[
p(x, x_d) = \sum_{i=0}^{m-j-1} p_i(x)\, x_d^i
\]
and that $p_i|_{X_j^{d-1}} = 0$ for $0 \le i \le m-j-1$. Next fix an arbitrary $\beta \in X_{j+1}^{d-1} \setminus X_j^{d-1}$. Then $|\beta| = j+1$ and the data points $(\beta, k)$ are in $X_m^d$ for $0 \le k \le m-j-1$. Thus the univariate polynomial $p(\beta, \cdot)$, which is of degree $m-j-1$, has $m-j$ zeros and is therefore identically zero. This means $p_i(\beta) = 0$ for all $\beta \in X_{j+1}^{d-1}$ and all $0 \le i \le m-j-1$, giving in particular $p_{m-j-1} \equiv 0$ by our induction assumption on $d-1$. This finishes our induction on $j$ and hence also the induction on $d$.

This last proof gives an idea of how complicated the situation in more than one dimension can be. If the data sites are part of a mildly disturbed grid, however, interpolation with polynomials is always possible.

3 Local polynomial reproduction

One reason for studying polynomials in approximation theory is that they allow a simple error analysis. For example, if the univariate function $f$ possesses $k$ continuous derivatives around a point $x_0 \in \mathbb{R}$, then the Taylor polynomial
\[
p(x) = \sum_{j=0}^{k-1} \frac{f^{(j)}(x_0)}{j!} (x - x_0)^j
\]
enjoys, for $|x - x_0| \le h$, the local approximation error
\[
|f(x) - p(x)| = \frac{|f^{(k)}(\xi)|}{k!} |x - x_0|^k \le C h^k
\]
with $\xi$ between $x$ and $x_0$. This local approximation order is inherited by every approximation process that recovers polynomials at least locally.
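As a quick numerical sanity check (my own illustration, not part of the text), the sketch below evaluates the Taylor remainder bound for the concrete choice $f = \exp$, $x_0 = 0$ and $k = 4$, where $C = \max_{|\xi| \le h} |f^{(k)}(\xi)| / k! = e^h/k!$. The function name and sampling resolution are illustrative assumptions.

```python
import math

def taylor_error(h, k, n=1000):
    """Maximum |f(x) - p(x)| over |x| <= h for f = exp expanded at x0 = 0,
    where p is the Taylor polynomial of degree k - 1 (sampled on a grid)."""
    worst = 0.0
    for i in range(n + 1):
        x = -h + 2 * h * i / n
        p = sum(x**j / math.factorial(j) for j in range(k))
        worst = max(worst, abs(math.exp(x) - p))
    return worst

k = 4
for h in (0.4, 0.2, 0.1):
    err = taylor_error(h, k)
    bound = math.exp(h) * h**k / math.factorial(k)  # C h^k with C = e^h / k!
    print(h, err, err <= bound)
```

Halving $h$ should shrink the error by roughly $2^k = 16$, consistent with the $O(h^k)$ rate.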

In this chapter we will work out these ideas more clearly by introducing the concept of

a local polynomial reproduction and discussing its existence. In the next chapter we will

give a concrete example of a local polynomial reproduction. But even the theoretical results

provided here are useful, since they often allow error estimates for other approximation

methods.

3.1 Definition and basic properties

Given a set $X = \{x_1, \ldots, x_N\}$ of pairwise distinct points in $\Omega \subseteq \mathbb{R}^d$ and function values $f(x_1), \ldots, f(x_N)$, we are concerned with finding an approximant $s$ to the unknown function $f$. One way to form such an approximant is to separate the influence of the data sites and the data values by choosing functions $u_j : \Omega \to \mathbb{R}$, $1 \le j \le N$, which depend only on $X$, and forming
\[
s(x) = \sum_{j=1}^{N} f(x_j) u_j(x).
\]

If the functions $u_j$ are cardinal with respect to $X$, that is, $u_j(x_k) = \delta_{j,k}$ for $1 \le j, k \le N$, we obviously have an interpolant. The reader might think of Lagrange interpolation with univariate polynomials as an example. If the functions $u_j$ are not cardinal then $s$ is sometimes called a quasi-interpolant.
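To see the cardinal case concretely, here is a small univariate sketch (my own illustration, not from the text): Lagrange cardinal functions on three hypothetical sites, and the resulting approximant $s(x) = \sum_j f(x_j) u_j(x)$, which interpolates precisely because the $u_j$ are cardinal.

```python
def lagrange_cardinals(xs):
    """Cardinal functions u_j for univariate Lagrange interpolation on the
    sites xs, satisfying u_j(x_k) = 1 if j == k and 0 otherwise."""
    def make(j):
        def uj(x):
            val = 1.0
            for k, xk in enumerate(xs):
                if k != j:
                    val *= (x - xk) / (xs[j] - xk)
            return val
        return uj
    return [make(j) for j in range(len(xs))]

xs = [0.0, 0.5, 1.0]          # hypothetical data sites
us = lagrange_cardinals(xs)

def s(x, f):
    # s(x) = sum_j f(x_j) u_j(x); an interpolant since the u_j are cardinal
    return sum(f(xj) * uj(x) for xj, uj in zip(xs, us))

print(s(0.3, lambda x: x * x))  # exact for quadratics on three sites
```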

In what follows the fill distance, which has already been introduced in Chapter 1, will play an important role, as it will throughout this book. For convenience we restate its definition here. For a set of points $X = \{x_1, \ldots, x_N\}$ in a bounded domain $\Omega \subseteq \mathbb{R}^d$ the fill distance is defined to be
\[
h_{X,\Omega} = \sup_{x \in \Omega} \min_{1 \le j \le N} \|x - x_j\|_2.
\]
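Since $\Omega$ is a continuum, the supremum can only be approximated in code. The sketch below (an illustration, not from the text) samples a hypothetical domain $\Omega = [0,1]^2$ on a fine grid and takes the largest nearest-site distance over the sample.

```python
import math

def fill_distance(points, sample):
    """Approximate h_{X,Omega}: the largest distance from a sample point of
    Omega to its nearest data site. The sup is only approximated, since the
    domain is replaced by a finite sample."""
    def nearest(y):
        return min(math.dist(y, x) for x in points)
    return max(nearest(y) for y in sample)

# Omega = [0,1]^2 sampled on a 51 x 51 grid (resolution is an assumption)
n = 50
sample = [(i / n, j / n) for i in range(n + 1) for j in range(n + 1)]
X = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (0.5, 0.5)]
print(fill_distance(X, sample))  # edge midpoints are farthest: 0.5
```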

Now it is time to come to the basic definition of what we call a local polynomial reproduction.

Definition 3.1 A process that defines for every set $X = \{x_1, \ldots, x_N\} \subseteq \Omega$ a family of functions $u_j = u_j^X : \Omega \to \mathbb{R}$, $1 \le j \le N$, provides a local polynomial reproduction of degree $\ell$ on $\Omega$ if there exist constants $h_0, C_1, C_2 > 0$ such that

(1) $\sum_{j=1}^{N} p(x_j) u_j = p$ for all $p \in \pi_\ell(\mathbb{R}^d)|_\Omega$,

(2) $\sum_{j=1}^{N} |u_j(x)| \le C_1$ for all $x \in \Omega$,

(3) $u_j(x) = 0$ if $\|x - x_j\|_2 > C_2 h_{X,\Omega}$ and $x \in \Omega$,

is satisfied for all $X$ with $h_{X,\Omega} \le h_0$.

Sometimes we will also say that $\{u_j\}$ forms a local polynomial reproduction or, more loosely, that the quasi-interpolant $s = \sum_j f(x_j) u_j$ provides a local polynomial reproduction.
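A concrete family satisfying all three conditions (my own univariate illustration, not from the text) is given by piecewise-linear "hat" functions on a grid: they reproduce polynomials of degree $1$, their absolute values sum to $1$ at every point (so $C_1 = 1$), and each $u_j$ vanishes outside the two cells adjacent to $x_j$ (so $C_2$ is tied to the grid spacing).

```python
def hat_functions(xs):
    """Piecewise-linear hat functions on sorted univariate sites xs.
    They satisfy conditions (1)-(3) of Definition 3.1 with degree 1."""
    def make(j):
        def uj(x):
            if j > 0 and xs[j - 1] <= x <= xs[j]:
                return (x - xs[j - 1]) / (xs[j] - xs[j - 1])
            if j < len(xs) - 1 and xs[j] <= x <= xs[j + 1]:
                return (xs[j + 1] - x) / (xs[j + 1] - xs[j])
            return 0.0
        return uj
    return [make(j) for j in range(len(xs))]

xs = [0.0, 0.25, 0.5, 0.75, 1.0]   # hypothetical data sites
us = hat_functions(xs)
x = 0.6
# condition (1) for the linear polynomial p(t) = 2t + 1
val = sum((2 * xj + 1) * uj(x) for xj, uj in zip(xs, us))
# condition (2): sum_j |u_j(x)| = 1, so C_1 = 1
tot = sum(abs(uj(x)) for uj in us)
print(val, tot)
```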

The crucial point in the de¬nition is that the constants involved are independent of the

data sites. Moreover, so far we have not talked about the data values at all.

The first and the third conditions justify the name local polynomial reproduction. The second condition is important for the approximation property of the associated quasi-interpolant. So far, we have not specified the functions $u_j$ any further. Obviously they will depend on the set $X$, but they might not even be continuous. In the next chapter we will show how to construct such functions having arbitrary smoothness.

The reason for investigating local polynomial reproductions is that they allow a very

simple error analysis.

Theorem 3.2 Suppose that $\Omega \subseteq \mathbb{R}^d$ is bounded. Define $\Omega^*$ to be the closure of $\bigcup_{x \in \Omega} B(x, C_2 h_0)$. Define $s_{f,X} = \sum_{j=1}^{N} f(x_j) u_j$, where $\{u_j\}$ is a local polynomial reproduction of order $m$ on $\Omega$. If $f \in C^{m+1}(\Omega^*)$ then there exists a constant $C > 0$, depending only on the constants from Definition 3.1, such that
\[
|f(x) - s_{f,X}(x)| \le C h_{X,\Omega}^{m+1} |f|_{C^{m+1}(\Omega^*)}
\]
for all $X$ with $h_{X,\Omega} \le h_0$. The semi-norm on the right-hand side is defined by $|f|_{C^{m+1}(\Omega^*)} := \max_{|\alpha| = m+1} \|D^\alpha f\|_{L_\infty(\Omega^*)}$.
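The predicted rate can be observed numerically. The sketch below (an illustration under my own choices, not from the text) uses piecewise-linear interpolation of $\sin$ on $[0, \pi]$, which realizes a local polynomial reproduction of order $m = 1$, and checks that halving the spacing $h$ reduces the maximum error by roughly a factor of $2^{m+1} = 4$.

```python
import math

def pl_interp_error(n):
    """Maximum error of piecewise-linear interpolation of sin on [0, pi]
    with n + 1 equally spaced sites; Theorem 3.2 predicts O(h^2) for this
    order-1 local polynomial reproduction."""
    xs = [math.pi * j / n for j in range(n + 1)]
    worst = 0.0
    for i in range(10 * n):
        x = math.pi * i / (10 * n)
        j = min(int(x / (math.pi / n)), n - 1)   # cell containing x
        t = (x - xs[j]) / (xs[j + 1] - xs[j])
        s = (1 - t) * math.sin(xs[j]) + t * math.sin(xs[j + 1])
        worst = max(worst, abs(math.sin(x) - s))
    return worst

for n in (8, 16, 32):
    print(n, pl_interp_error(n))  # errors shrink ~4x when h halves
```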


Proof Let $p$ be an arbitrary polynomial from $\pi_m(\mathbb{R}^d)$. Then, using the properties of a local polynomial reproduction gives
\[
\begin{aligned}
|f(x) - s_{f,X}(x)| &\le |f(x) - p(x)| + \Big| p(x) - \sum_{j=1}^{N} f(x_j) u_j(x) \Big| \\
&\le |f(x) - p(x)| + \sum_{j=1}^{N} |p(x_j) - f(x_j)|\, |u_j(x)| \\
&\le \|f - p\|_{L_\infty(B(x, C_2 h_{X,\Omega}))} \Big( 1 + \sum_{j=1}^{N} |u_j(x)| \Big) \\
&\le (1 + C_1) \|f - p\|_{L_\infty(B(x, C_2 h_{X,\Omega}))}.
\end{aligned}
\]

Here, $B(x, r)$ denotes the closed ball around $x$ with radius $r > 0$. Now choose $p$ to be the Taylor polynomial of $f$ around $x$. This gives for $y \in \Omega$ a $\xi \in \Omega^*$ such that
\[
f(y) = \sum_{|\alpha| \le m} \frac{D^\alpha f(x)}{\alpha!} (y - x)^\alpha + \sum_{|\alpha| = m+1} \frac{D^\alpha f(\xi)}{\alpha!} (y - x)^\alpha.
\]

Hence, we can conclude that
