
LOCAL LINEAR ESTIMATION OF SCALAR DIFFUSION MODELS BY USING ASYMMETRIC KERNELS

Muhammad Hanif
Department of Mathematics, Zhejiang University, Hangzhou, China, 310027

Abstract: In this paper, we study the nonparametric estimation of the first and second infinitesimal moments of an underlying scalar diffusion model by the local linear method. Special attention is paid to asymmetric kernels instead of standard kernel smoothing; we focus on asymmetric kernels because they are well suited to diffusion processes that live on a support with one or two boundaries. We establish the pointwise consistency and asymptotic normality of the proposed estimators under the conditions of recurrence and of stationarity.

Keywords: Beta kernel; Gamma kernel; Harris recurrence; Diffusion model; Local time; Local linear method; Nonparametric estimation.

1. Introduction

In this paper, we present the nonparametric estimation of the first and second infinitesimal moments of an underlying scalar diffusion model by the local linear method. We concentrate on asymmetric kernels (the Beta kernel and the Gamma kernel) instead of conventional symmetric kernel smoothing; asymmetric kernels are appropriate when the diffusion lives on a support with one or two boundaries. Nonparametric estimation based on kernel smoothing methods has received considerable attention in the statistical literature for many years. A popular approach estimates the coefficients of the underlying model with a fixed symmetric kernel function, which depends on a bounded support and a bandwidth parameter. The kernel determines the shape of the resulting estimate, while the bandwidth controls the degree of smoothness; it is well known from the literature that the kernel has less impact than the bandwidth on the resulting estimate. For a review of fixed kernel smoothing see, e.g., Arapis and Gao (2006), Bandi and Phillips (2003), Jiang and Knight (1997), Nicolau (2003), Stanton (1997) and many others.

(Corresponding author: mhpuno@hotmail.com)
Fixed kernel smoothing causes boundary bias because it allocates weight outside the support. In order to overcome the boundary bias problem, Chen (2000) considered asymmetric kernel estimators. These asymmetric kernels are nonnegative and free of boundary bias; moreover, their shape varies according to the location of the design point. Usually, researchers refer to asymmetric kernels as kernel functions with support on the nonnegative real line. Bouezmarni and Robin (2003), Brown and Chen (1999), and Jones and Henderson (2007) considered estimation of density and regression functions defined over the unit interval using different versions of asymmetric kernels. For further references on asymmetric kernels see, e.g., Bouezmarni and Scaillet (2005), Gustafsson et al. (2009), Hagmann and Scaillet (2007), Renault and Scaillet (2004), Scaillet (2004) and Shunpu (2010).
Several nonparametric methods are available in the literature for estimating the coefficients of the underlying models. One of the most widely used and studied is the Nadaraya-Watson method. Fan and Gijbels (1996) showed that this method suffers from several severe drawbacks, such as poor boundary performance, larger bias and low efficiency. In this situation the local polynomial methods developed by Stone (1977) are generally preferable. Local polynomial fitting, and particularly its special case the local linear method, has become increasingly popular in recent years. The local linear method has many advantages and overcomes the difficulties associated with the Nadaraya-Watson method; local linear estimators have some nice features, such as independence of the regression design, efficiency and good boundary behavior. For a detailed discussion of the local linear estimator and other kernel smoothing methods see, for example, Eubank (1988), Hardle (1990), Cleveland (1979), Fan (1992), Fan and Gijbels (1992), Ruppert and Wand (1994), and several others.
Consider the continuous-time Ito diffusion model defined by the following stochastic differential equation:

$$dX_t = \mu(X_t)\,dt + \sigma(X_t)\,dW_t, \qquad (1.1)$$

where $\mu(x)$ and $\sigma^2(x)$ are the drift function and the diffusion function respectively, and $\{W_t,\ 0 \le t \le T\}$ is a standard Brownian motion. This time-homogeneous diffusion model has been widely used for describing the properties of underlying economic variables. Gospodinov and Hirukawa (2011) studied the Gamma kernel for the Nadaraya-Watson estimator of the diffusion model and obtained the asymptotic normality of the drift and diffusion estimators. The purpose of this paper is to study the nonparametric estimation of the first and second infinitesimal moments by the local linear method for the diffusion model. We obtain the pointwise consistency and asymptotic normality of the estimators under recurrence and under stationarity.
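To fix ideas, a sample path of model (1.1) can be generated on a discrete grid by the Euler scheme. The sketch below is illustrative only and is not part of the paper's theory; the drift and diffusion functions are hypothetical examples chosen so that the simulated process stays inside the unit interval, matching the support assumed later.

```python
import numpy as np

def simulate_diffusion(mu, sigma, x0, T, n, seed=0):
    """Euler approximation of dX_t = mu(X_t) dt + sigma(X_t) dW_t,
    sampled at t = 0, Delta, 2*Delta, ..., T with Delta = T/n."""
    rng = np.random.default_rng(seed)
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over one step
        x[i + 1] = x[i] + mu(x[i]) * dt + sigma(x[i]) * dw
    return x

# Hypothetical mean-reverting example whose path stays inside (0, 1).
path = simulate_diffusion(mu=lambda x: 2.0 * (0.5 - x),
                          sigma=lambda x: 0.2 * np.sqrt(max(x * (1.0 - x), 0.0)),
                          x0=0.5, T=10.0, n=5000)
```

Paths simulated this way serve as test data for the estimators introduced in Section 2.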
The rest of the paper is organized as follows: Section 2 introduces the asymmetric kernels (Beta and Gamma) and the local linear estimator. In that section, we also define the conditional moments of the diffusion model and derive the asymmetric kernel estimators for the conditional moments. In Section 3, we present assumptions and some useful preliminary results about the local time; the main findings of the paper are also summarized in Section 3. Section 4 contains the proofs of the presented results.

2. Beta kernel and local linear estimator

The standard kernel estimator of a density function $f$ is defined as

$$\hat f(x) = \frac{1}{nh}\sum_{i=1}^{n} K\Bigl(\frac{x - X_i}{h}\Bigr),$$

where $K$ is a symmetric kernel function and $h$ is the smoothing bandwidth.
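For concreteness, here is a minimal sketch of this fixed-kernel estimator with a Gaussian kernel (an illustration, not part of the paper's method):

```python
import numpy as np

def kde(x, data, h):
    """Fixed symmetric kernel estimate f_hat(x) = (1/(n*h)) * sum K((x - X_i)/h),
    here with the Gaussian kernel K(u) = exp(-u^2/2)/sqrt(2*pi)."""
    u = (x - np.asarray(data)) / h
    return np.mean(np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)) / h
```

Near a boundary of the support this kernel places weight on points outside the support, which is exactly the boundary-bias problem discussed next.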
Chen (2000) proposed a Beta kernel function with compact support $[0, 1]$, defined by

$$K_{B(x,b)}(a) = \frac{a^{x/b}\,(1-a)^{(1-x)/b}}{B\{x/b + 1,\ (1-x)/b + 1\}},$$

where $B(l,m) = \int_0^1 y^{l-1}(1-y)^{m-1}\,dy$, $l, m > 0$, is the Beta function and $b$ is the smoothing parameter. This kernel is the density of the $B\{x/b + 1,\ (1-x)/b + 1\}$ distribution. The Gamma kernel function is represented by

$$K_{G(x/b+1,\,b)}(a) = \frac{a^{x/b}\exp(-a/b)}{b^{x/b+1}\,\Gamma(x/b + 1)},$$

where $\Gamma(m) = \int_0^\infty y^{m-1}\exp(-y)\,dy$, $m > 0$, is the Gamma function and $b$ the smoothing parameter; this kernel is the density of a Gamma distribution with support $[0, \infty)$. As usual, the smoothing bandwidth $b$ converges to zero as the sample size $n$ grows. The asymmetric kernels are flexible enough that their shape and the amount of smoothing vary according to the location within the support. The asymptotic properties of the estimators depend on the position of the design point $x$: we call $x$ an interior point if $x/b \to \infty$, and a boundary point if $x/b \to k$ for some $k > 0$, as $n, T \to \infty$.
Remark 1. Beta kernels are particularly appropriate for estimating densities with compact support, whereas Gamma kernels are more convenient for handling density functions whose support is bounded from one end only. Both the Beta and the Gamma kernel estimators achieve the optimal rate of convergence for the mean integrated squared error among nonnegative kernels.
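The two kernels above transcribe directly into code. The following sketch uses log-gamma evaluations for numerical stability; it is an illustration, not part of the paper's derivations.

```python
import math

def beta_kernel(a, x, b):
    """K_{B(x,b)}(a): the Beta(x/b + 1, (1-x)/b + 1) density at a in (0, 1).
    The design point x sets the shape; b is the smoothing parameter."""
    p, q = x / b + 1.0, (1.0 - x) / b + 1.0
    log_beta = math.lgamma(p) + math.lgamma(q) - math.lgamma(p + q)
    return math.exp((p - 1.0) * math.log(a)
                    + (q - 1.0) * math.log(1.0 - a) - log_beta)

def gamma_kernel(a, x, b):
    """K_{G(x/b+1,b)}(a): the Gamma(shape = x/b + 1, scale = b) density at a > 0."""
    shape = x / b + 1.0
    return math.exp((shape - 1.0) * math.log(a) - a / b
                    - shape * math.log(b) - math.lgamma(shape))
```

Unlike a fixed symmetric kernel, the shape of both densities changes with the design point $x$, so no weight is ever placed outside the support.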

The conditional moments of the diffusion model that form the basis of our estimation procedure are defined by the following relations:

$$M^1(x) := \lim_{\Delta\to 0}\frac{1}{\Delta}\,E[X_{t+\Delta} - X_t \mid X_t = x], \qquad (2.1)$$

$$M^2(x) := \lim_{\Delta\to 0}\frac{1}{\Delta}\,E[(X_{t+\Delta} - X_t)^2 \mid X_t = x] \qquad (2.2)$$

(cf. Bandi and Phillips (2003)).


The kernel smoothing method has become a powerful and useful tool for data analysis. The local polynomial estimator is obtained by locally fitting a $p$th-degree polynomial to the data via weighted least squares. To describe the local linear method, suppose that the second derivative of the regression function $m$ exists. Then, in a neighborhood of a point $t$, a Taylor expansion gives

$$m(x) = m(t) + m'(t)(x - t) + o(x - t) \approx a + \beta(x - t).$$

By considering a weighted local linear regression, we find the estimators of $a$ and $\beta$ by minimizing

$$\sum_{i=1}^{n}\bigl(Y_i - a - \beta(X_i - x)\bigr)^2\,K_{B(x,b)}(X_i), \qquad (2.3)$$

where $b$ is the smoothing bandwidth. The resulting regression estimators

$$\hat a = \hat m(x) = \frac{\sum_{i=1}^{n} w_i\,Y_i}{\sum_{i=1}^{n} w_i} \quad\text{and}\quad \hat b^2(x) = \frac{\sum_{i=1}^{n} w_i\,Y_i^2}{\sum_{i=1}^{n} w_i},$$

with $w_i = w_i^{LL}(x,b)\,K_{B(x,b)}(X_i)$, $w_i^{LL}(x,b) = S_{n,2} - S_{n,1}(X_i - x)$ and $S_{n,j} = \sum_{i=1}^{n} K_{B(x,b)}(X_i)(X_i - x)^j$, are the local linear estimators of $m(x)$ and $b^2(x)$ respectively. Under some regularity conditions, the drift and diffusion in the diffusion process (1.1) are the first two moments of the infinitesimal conditional distribution of $X_t$:

$$\mu(x) = \lim_{\Delta\to 0} E\Bigl[\frac{X_{t+\Delta} - X_t}{\Delta}\,\Big|\,X_t = x\Bigr], \qquad (2.4)$$

$$\sigma^2(x) = \lim_{\Delta\to 0} E\Bigl[\frac{(X_{t+\Delta} - X_t)^2}{\Delta}\,\Big|\,X_t = x\Bigr] \qquad (2.5)$$

respectively. The drift describes the movement of $X_t$ due to time changes, whereas the diffusion term measures the magnitude of the random fluctuations around the drift.
Following Fan and Zhang (2003), the local polynomial estimators of $M^1(x)$ and $M^2(x)$, denoted by $\hat M^1_{LL}(x,b)$ and $\hat M^2_{LL}(x,b)$ respectively, minimize the weighted sums of squares

$$\sum_{i=1}^{n}\Bigl[\frac{X_{(i+1)\Delta_{n,T}} - X_{i\Delta_{n,T}}}{\Delta_{n,T}} - \sum_{j=0}^{p}\beta_j\,(X_{i\Delta_{n,T}} - x)^j\Bigr]^2 K_{B(x,b)}(X_{i\Delta_{n,T}})$$

and

$$\sum_{i=1}^{n}\Bigl[\frac{(X_{(i+1)\Delta_{n,T}} - X_{i\Delta_{n,T}})^2}{\Delta_{n,T}} - \sum_{j=0}^{p}\beta_j\,(X_{i\Delta_{n,T}} - x)^j\Bigr]^2 K_{B(x,b)}(X_{i\Delta_{n,T}}).$$

The local linear estimators of $M^1(x)$ and $M^2(x)$, i.e. the case $p = 1$, have the following forms:

$$\hat M^1_{LL}(x,b) = \frac{\sum_{i=1}^{n-1} w_i^{LL}(x,b)\,K_{B(x,b)}(X_{i\Delta_{n,T}})\,\bigl(X_{(i+1)\Delta_{n,T}} - X_{i\Delta_{n,T}}\bigr)/\Delta_{n,T}}{\sum_{i=1}^{n} w_i^{LL}(x,b)\,K_{B(x,b)}(X_{i\Delta_{n,T}})} \qquad (2.6)$$

and

$$\hat M^2_{LL}(x,b) = \frac{\sum_{i=1}^{n-1} w_i^{LL}(x,b)\,K_{B(x,b)}(X_{i\Delta_{n,T}})\,\bigl(X_{(i+1)\Delta_{n,T}} - X_{i\Delta_{n,T}}\bigr)^2/\Delta_{n,T}}{\sum_{i=1}^{n} w_i^{LL}(x,b)\,K_{B(x,b)}(X_{i\Delta_{n,T}})} \qquad (2.7)$$

respectively. Similarly, the local linear estimators of $M^1(x)$ and $M^2(x)$ for the Gamma kernel are obtained by replacing $K_{B(x,b)}$ with $K_{G(x/b+1,\,b)}$ in (2.6) and (2.7).
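The estimators (2.6)-(2.7) translate almost line for line into code. The sketch below assumes the local linear weight $w_i^{LL}(x,b) = S_{n,2} - S_{n,1}(X_i - x)$ with $S_{n,j} = \sum_i K_{B(x,b)}(X_i)(X_i - x)^j$; the Beta kernel helper and the simulated path used in the check are hypothetical stand-ins, not the paper's data.

```python
import numpy as np
from math import lgamma

def beta_kernel(a, x, b):
    """Vectorized Beta kernel K_{B(x,b)}(a) for an array a with values in (0, 1)."""
    p, q = x / b + 1.0, (1.0 - x) / b + 1.0
    log_beta = lgamma(p) + lgamma(q) - lgamma(p + q)
    return np.exp((p - 1.0) * np.log(a) + (q - 1.0) * np.log1p(-a) - log_beta)

def local_linear_moments(path, dt, x, b):
    """Local linear (p = 1) estimates of M^1(x) and M^2(x) as in (2.6)-(2.7)."""
    xi = path[:-1]             # X_{i*Delta}
    dx = np.diff(path)         # X_{(i+1)*Delta} - X_{i*Delta}
    k = beta_kernel(xi, x, b)
    s1 = np.sum(k * (xi - x))          # S_{n,1}
    s2 = np.sum(k * (xi - x) ** 2)     # S_{n,2}
    w = (s2 - s1 * (xi - x)) * k       # w_i^{LL}(x,b) * K_{B(x,b)}(X_i)
    m1 = np.sum(w * dx / dt) / np.sum(w)
    m2 = np.sum(w * dx ** 2 / dt) / np.sum(w)
    return m1, m2
```

Applied to a discretely sampled diffusion path, `m1` and `m2` estimate the drift $\mu(x)$ and the squared diffusion $\sigma^2(x)$ at the design point.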

In this paper, we study the local linear estimators of the first and second infinitesimal moments of the diffusion model by using asymmetric kernels (Beta and Gamma), and we obtain the pointwise consistency and asymptotic normality of the proposed estimators under recurrence and under stationarity.

3. Assumptions and main results


The basic assumptions are listed and discussed below. Assume that $I = [0, 1]$ is the range of the process $X_t$. The scale density function of the process $X_t$ is defined as

$$s(\epsilon) = \int_{z_0}^{\epsilon}\exp\Bigl\{-\int_{z_0}^{z}\frac{2\mu(x)}{\sigma^2(x)}\,dx\Bigr\}\,dz,$$

where $z_0$ is any fixed point belonging to $[0, 1]$. Moreover, $m(x) = 2\bigl(\sigma^2(x)\,s'(x)\bigr)^{-1}$ denotes the speed function of the process $X_t$.
A1.
(i) The coefficients $\mu(\cdot)$ and $\sigma(\cdot)$ have continuous derivatives of order 2 and satisfy

$$|\mu(x) - \mu(z)| + |\sigma(x) - \sigma(z)| \le C_1\,|x - z| \qquad (3.1)$$

and

$$|\mu(x)| + |\sigma(x)| \le C_2\,\{1 + |x|\}. \qquad (3.2)$$

(ii) $\sigma^2(\cdot) > 0$.

Remark 2. Assumption A1 assures the existence and uniqueness of a strong solution to the stochastic differential equation (1.1).

A2.
The solution to Eq. (1.1) is positive Harris recurrent.

Remark 3. Harris recurrence guarantees the existence and uniqueness of the invariant measure $s(dx)$. This assumption is much weaker than the usual stationarity assumption.
A3.
$T \to \infty$, $\Delta_{n,T} := T/n \to 0$ and $b = b_{n,T} \to 0$ as $n \to \infty$, such that

$$\frac{\bar L_X(T,x)}{b}\Bigl(\Delta_{n,T}\log\frac{1}{\Delta_{n,T}}\Bigr)^{1/2} = o_{a.s.}(1). \qquad (3.3)$$

A4.
$T \to \infty$, $\Delta_{n,T} := T/n \to 0$ and $b = b_{n,T} \to 0$ as $n \to \infty$, such that

$$\frac{T}{b}\Bigl(\Delta_{n,T}\log\frac{1}{\Delta_{n,T}}\Bigr)^{1/2} = o_{a.s.}(1). \qquad (3.4)$$

The following lemmas are useful in obtaining the main results of our paper (cf. Protter (1995) and Revuz and Yor (1998)).

Lemma 1. Let the process $X_t$ be a semimartingale with quadratic variation $[X]^c_s$ and let $L_X(t,a)$ be its local time at $a$. Then for every positive bounded Borel measurable function $g$,

$$\int_0^t g(X_s)\,d[X]^c_s = \int_{-\infty}^{\infty} L_X(t,a)\,g(a)\,da \quad a.s.$$

Lemma 2. Let $X_t$ be a semimartingale satisfying $\sum_{0<s\le t}|\Delta X_s| < \infty$ a.s. for all $t > 0$. Then, for any $a$ and $t$, we have

$$\bar L_X(t,a) := \lim_{\epsilon\to 0}\frac{1}{2\epsilon}\int_0^t \mathbf{1}(|X_s - a| \le \epsilon)\,\frac{d[X]^c_s}{\sigma^2(X_s)} = \frac{1}{\sigma^2(a)}\,L_X(t,a) \quad a.s.$$

The convergence rate of the nonparametric estimators of the diffusion model can be expressed through the local time; define the estimated local time

$$\bar L^{(k)}_X(T,x) = \Delta_{n,T}\sum_{i=1}^{n}\bigl(K_{B(x,b)}(X_{i\Delta_{n,T}})\bigr)^k. \qquad (3.5)$$

Lemma 3. (Bandi and Phillips, 2003) Let $X_t$ be the process defined by (1.1) and $b = b_{n,T} \to 0$ as $n \to \infty$. If

$$\frac{1}{b}\Bigl(\Delta_{n,T}\log\frac{1}{\Delta_{n,T}}\Bigr)^{1/2} = o(1), \qquad (3.6)$$

then

$$\bar L^{(k)}_X(T,x) \xrightarrow{a.s.} B_k\,\bar L_X(T,x). \qquad (3.7)$$
Remark 4. In symmetric kernel smoothing, $K_{B(x,b)}$ would be replaced by $K\bigl((X_{i\Delta_{n,T}} - x)/h_{n,T}\bigr)$, whose shape is fixed. The shape of $K_{B(x,b)}$, on the other hand, is not fixed and varies along with the design point $x$.
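The estimated local time (3.5) with $k = 1$ is simply the kernel-weighted occupation measure of the sampled path; under stationarity it grows like $T\,f(x)$. A sketch follows; the iid sample used in the check is only a stand-in for a genuinely sampled diffusion path.

```python
import numpy as np
from math import lgamma

def local_time_estimate(path, dt, x, b):
    """L-bar^(1)_X(T, x) = Delta_{n,T} * sum_i K_{B(x,b)}(X_{i*Delta}), eq. (3.5),
    with the Beta kernel; path values must lie in (0, 1)."""
    p, q = x / b + 1.0, (1.0 - x) / b + 1.0
    log_beta = lgamma(p) + lgamma(q) - lgamma(p + q)
    k = np.exp((p - 1.0) * np.log(path) + (q - 1.0) * np.log1p(-path) - log_beta)
    return dt * np.sum(k)
```

For a stationary process observed over $[0, T]$ with stationary density $f$, the returned value is approximately $T\,f(x)$, which is how $\bar L_X(T,x)/T \to f(x)$ is used in the corollaries below.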

We now give the main results of our paper.

THEOREM 1. Under Assumptions A1-A3,

$$\hat M^1_{LL}(x,b) \xrightarrow{a.s.} M^1(x).$$

For interior $x$, if $b^{5/2}\,\bar L_X(T,x) = O_{a.s.}(1)$, then

$$\sqrt{b^{1/2}\,\bar L_X(T,x)}\,\Bigl(\hat M^1_{LL}(x,b) - M^1(x) - b\,B_{\hat M^1_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{M^2(x)\,B_M}{2\sqrt{\pi}\sqrt{x(1-x)}}\Bigr), \qquad (3.8)$$

where $B_{\hat M^1_{LL}(x,b)}$ denotes the bias of the estimator $\hat M^1_{LL}(x,b)$, given by

$$B_{\hat M^1_{LL}(x,b)} = \frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^1(x))'\Bigl\{1 + \frac{x(1-x)\,s'(x)}{s(x)}\Bigr\}.$$

Also

$$B_M = \frac{(B_2)^2 - (B_1)^2 B_2}{[B_2 - (B_1)^2]^2},$$

with $B_1 = \int_0^1 K_{B(x,b)}(a)(a-x)\,da$ and $B_2 = \int_0^1 K_{B(x,b)}(a)(a-x)^2\,da$.

Similarly, for boundary $x$, if $b^{3}\,\bar L_X(T,x) = O_{a.s.}(1)$, then

$$\sqrt{b\,\bar L_X(T,x)}\,\Bigl(\hat M^1_{LL}(x,b) - M^1(x) - b\,B_{\hat M^1_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{M^2(x)\,\Gamma(2k+1)\,B_M}{2^{2k+1}\,\Gamma^2(k+1)}\Bigr). \qquad (3.9)$$

THEOREM 2. Under Assumptions A1-A3,

$$\hat M^2_{LL}(x,b) \xrightarrow{a.s.} M^2(x).$$

For interior $x$, if $b^{5/2}\,\bar L_X(T,x)/\Delta_{n,T} = O_{a.s.}(1)$, then

$$\sqrt{\frac{b^{1/2}\,\bar L_X(T,x)}{\Delta_{n,T}}}\,\Bigl(\hat M^2_{LL}(x,b) - M^2(x) - b\,B_{\hat M^2_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{2(M^2(x))^2\,B_M}{\sqrt{\pi}\sqrt{x(1-x)}}\Bigr), \qquad (3.10)$$

where

$$B_{\hat M^2_{LL}(x,b)} = \frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^2(x))'\Bigl\{1 + \frac{x(1-x)\,s'(x)}{s(x)}\Bigr\}.$$

Furthermore, for boundary $x$, if $b^{3}\,\bar L_X(T,x)/\Delta_{n,T} = O_{a.s.}(1)$, then

$$\sqrt{\frac{b\,\bar L_X(T,x)}{\Delta_{n,T}}}\,\Bigl(\hat M^2_{LL}(x,b) - M^2(x) - b\,B_{\hat M^2_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{4(M^2(x))^2\,\Gamma(2k+1)\,B_M}{2^{2k+1}\,\Gamma^2(k+1)}\Bigr). \qquad (3.11)$$

The local linear estimators for the Beta kernel can also be obtained under the condition of stationarity. To obtain the results under stationarity, we use the condition that $\int_0^1 m(x)\,dx < \infty$. Moreover, the stationarity condition implies that $\bar L_X(T,x)/T \xrightarrow{a.s.} f(x)$ and that $s'(x)/s(x)$ may be replaced by $f'(x)/f(x)$, where $f$ denotes the stationary density. We have the following corollaries, for interior $x$ and for boundary $x$ respectively.

COROLLARY 1. Under Assumptions A1 and A4,

$$\hat M^1_{LL}(x,b) \xrightarrow{a.s.} M^1(x).$$

For interior $x$, if $b = O(T^{-2/5})$, then

$$\sqrt{T\,b^{1/2}}\,\Bigl(\hat M^1_{LL}(x,b) - M^1(x) - b\,B_{\hat M^1_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{M^2(x)\,B_M}{2\sqrt{\pi}\sqrt{x(1-x)}\,f(x)}\Bigr), \qquad (3.12)$$

where $B_{\hat M^1_{LL}(x,b)}$ denotes the bias of the estimator $\hat M^1_{LL}(x,b)$, given by

$$B_{\hat M^1_{LL}(x,b)} = \frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^1(x))'\Bigl\{1 + \frac{x(1-x)\,f'(x)}{f(x)}\Bigr\}.$$

Similarly, for boundary $x$, if $b = O(T^{-1/3})$, then

$$\sqrt{T\,b}\,\Bigl(\hat M^1_{LL}(x,b) - M^1(x) - b\,B_{\hat M^1_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{M^2(x)\,\Gamma(2k+1)\,B_M}{2^{2k+1}\,\Gamma^2(k+1)\,f(x)}\Bigr). \qquad (3.13)$$

COROLLARY 2. Under Assumptions A1 and A4,

$$\hat M^2_{LL}(x,b) \xrightarrow{a.s.} M^2(x).$$

For interior $x$, if $b = O(T^{-2/5})$, then

$$\sqrt{T\,b^{1/2}}\,\Bigl(\hat M^2_{LL}(x,b) - M^2(x) - b\,B_{\hat M^2_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{2(M^2(x))^2\,B_M}{\sqrt{\pi}\sqrt{x(1-x)}\,f(x)}\Bigr), \qquad (3.14)$$

where

$$B_{\hat M^2_{LL}(x,b)} = \frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^2(x))'\Bigl\{1 + \frac{x(1-x)\,f'(x)}{f(x)}\Bigr\}.$$

Furthermore, for boundary $x$, if $b = O(T^{-1/3})$, then

$$\sqrt{T\,b}\,\Bigl(\hat M^2_{LL}(x,b) - M^2(x) - b\,B_{\hat M^2_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{4(M^2(x))^2\,\Gamma(2k+1)\,B_M}{2^{2k+1}\,\Gamma^2(k+1)\,f(x)}\Bigr). \qquad (3.15)$$

4. Proofs
Proof of Lemma 3. Using Lemmas 1 and 2, we have

$$\int_0^T \bigl(K_{B(x,b)}(X_s)\bigr)^k\,ds = \int_0^T \bigl(K_{B(x,b)}(X_s)\bigr)^k\,\frac{d[X,X]_s}{\sigma^2(X_s)} = \int_0^1 \bigl(K_{B(x,b)}(a)\bigr)^k\,\frac{L_X(T,a)}{\sigma^2(a)}\,da = \int_0^1 \bigl(K_{B(x,b)}(a)\bigr)^k\,\bar L_X(T,a)\,da \to B_k\,\bar L_X(T,x), \qquad (4.1)$$

by a Taylor expansion around $a = x$.

To prove the result, it remains to show that

$$\Delta_{n,T}\sum_{i=1}^{n}\bigl(K_{B(x,b)}(X_{i\Delta_{n,T}})\bigr)^k - \int_0^T \bigl(K_{B(x,b)}(X_s)\bigr)^k\,ds \to 0, \qquad (4.2)$$

which is equivalent to

$$\sum_{i=0}^{n-1}\int_{iT/n}^{(i+1)T/n}\Bigl[\bigl(K_{B(x,b)}(X_{i\Delta_{n,T}})\bigr)^k - \bigl(K_{B(x,b)}(X_s)\bigr)^k\Bigr]\,ds - \Delta_{n,T}\bigl(K_{B(x,b)}(X_0)\bigr)^k + \Delta_{n,T}\bigl(K_{B(x,b)}(X_{n\Delta_{n,T}})\bigr)^k \to 0. \qquad (4.3)$$

By the mean value theorem, the left-hand side of (4.3) is bounded by

$$\sum_{i=0}^{n-1}\int_{iT/n}^{(i+1)T/n}\Bigl|k\,\bigl(K_{B(x,b)}(\tilde X_{is})\bigr)^{k-1} K'_{B(x,b)}(\tilde X_{is})\Bigr|\,|X_s - X_{i\Delta_{n,T}}|\,ds + \Delta_{n,T}\bigl(K_{B(x,b)}(X_0)\bigr)^k + \Delta_{n,T}\bigl(K_{B(x,b)}(X_{n\Delta_{n,T}})\bigr)^k, \qquad (4.4)$$

where $\tilde X_{is}$ is some value between $X_{i\Delta_{n,T}}$ and $X_s$. Let

$$k_{n,T} = \max_{i\le n}\ \sup_{i\Delta_{n,T}\le s\le (i+1)\Delta_{n,T}} |X_{i\Delta_{n,T}} - X_s|.$$

It follows from the Levy modulus of continuity for diffusion processes (Bandi and Phillips, 2003) that

$$k_{n,T} = O_{a.s.}\Bigl(\sqrt{\Delta_{n,T}\log\frac{1}{\Delta_{n,T}}}\Bigr). \qquad (4.5)$$

Using (4.5) and (3.6) we obtain

$$\bigl(K_{B(x,b)}(\tilde X_{is})\bigr)^k = \Bigl(K_{B(x,b)}\bigl(X_s + O_{a.s.}\bigl(\textstyle\sqrt{\Delta_{n,T}\log(1/\Delta_{n,T})}\bigr)\bigr)\Bigr)^k = \bigl(K_{B(x,b)}(X_s + o_{a.s.}(1))\bigr)^k.$$

The first term on the right-hand side of (4.4) is then bounded by a term of order

$$k_{n,T}\int_0^T \Bigl|k\,\bigl(K_{B(x,b)}\bigr)^{k-1}K'_{B(x,b)}\Bigr|\bigl(X_s + o_{a.s.}(1)\bigr)\,ds = \frac{k_{n,T}}{b}\,O_{a.s.}\bigl(\bar L_X(T,x)\bigr),$$

which is $o_{a.s.}(1)$ by (3.3). Similarly, we can bound the remaining quantities in (4.4). This completes the proof of Lemma 3.
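The rate (4.5) can be checked numerically: for Brownian increments over a grid of step $\Delta$, the largest increment is of order $\sqrt{\Delta\log(1/\Delta)}$ rather than $\sqrt{\Delta}$. A small illustrative sketch (not part of the proof):

```python
import numpy as np

def max_brownian_increment(n, T=1.0, seed=0):
    """Largest |W_{(i+1)Delta} - W_{i*Delta}| over n sampling intervals of length
    Delta = T/n; by the Levy modulus this is O_a.s.(sqrt(Delta * log(1/Delta)))."""
    rng = np.random.default_rng(seed)
    dt = T / n
    # Brownian increments over disjoint intervals are iid N(0, dt).
    return np.max(np.abs(rng.normal(0.0, np.sqrt(dt), size=n)))
```

With $n = 10^4$ (so $\Delta = 10^{-4}$), $\sqrt{\Delta\log(1/\Delta)} \approx 0.03$, and the observed maximum increment is of that order, an order of magnitude larger than a typical single increment of size $\sqrt{\Delta} = 0.01$.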
Proof of Theorem 1. Throughout, write $\Delta = \Delta_{n,T}$, $K_i = K_{B(x,b)}(X_{i\Delta})$ and $w_i = w_i^{LL}(x,b) = \Delta S_{n,2} - \Delta S_{n,1}(X_{i\Delta} - x)$. For the consistency of $\hat M^1_{LL}(x,b)$, write

$$\hat M^1_{LL}(x,b) = \frac{\sum_{i=1}^{n-1} w_i K_i\,\bigl(X_{(i+1)\Delta} - X_{i\Delta}\bigr)}{\Delta\sum_{i=1}^{n} w_i K_i} = \frac{\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}\mu(X_s)\,ds}{\Delta\sum_{i=1}^{n} w_i K_i} + \frac{\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}\sigma(X_s)\,dW_s}{\Delta\sum_{i=1}^{n} w_i K_i} =: A_{11} + B_{11}.$$

Firstly, we examine $A_{11}$:

$$A_{11} = \frac{\Delta\sum_{i=1}^{n-1} w_i K_i\,\bigl(\mu(X_{i\Delta}) + o_{a.s.}(1)\bigr)}{\Delta\sum_{i=1}^{n} w_i K_i} \xrightarrow{a.s.} \frac{[B_2 - (B_1)^2]\,\bar L_X(T,x)\,\mu(x)}{[B_2 - (B_1)^2]\,\bar L_X(T,x)} = \mu(x) = M^1(x)$$

around $a = x$. By the strong law of large numbers for martingale differences with zero mean and finite variance (Hall and Heyde, 1980), we can show that $B_{11} \xrightarrow{a.s.} 0$.

To derive the asymptotic normality of $\hat M^1_{LL}(x,b)$, observe that

$$\hat M^1_{LL}(x,b) - M^1(x) = \frac{\sum_{i=1}^{n-1} w_i K_i\bigl(\int_{i\Delta}^{(i+1)\Delta}\mu(X_s)\,ds - M^1(x)\,\Delta\bigr)}{\Delta\sum_{i=1}^{n} w_i K_i} + \frac{\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}\sigma(X_s)\,dW_s}{\Delta\sum_{i=1}^{n} w_i K_i} =: A_{22} + B_{22}. \qquad (4.6)$$

Using the quotient limit theorem for Harris recurrent Markov processes (Azema et al., 1967), we write

$$A_{22} = \frac{\Delta\sum_{i=1}^{n-1} w_i K_i\,\bigl(\mu(X_{i\Delta}) - M^1(x) + o_{a.s.}(1)\bigr)}{\Delta\sum_{i=1}^{n} w_i K_i}. \qquad (4.7)$$

Now, consider the numerator of (4.7). Expanding $w_i$ and approximating sums by integrals as in the proof of Lemma 3,

$$[A_{22}]^{num} \to \int_0^1 K_{B(x,b)}(a)(a-x)^2 s(a)\,da \int_0^1 K_{B(x,b)}(a)\bigl(\mu(a) - M^1(x)\bigr)s(a)\,da - \int_0^1 K_{B(x,b)}(a)(a-x)\,s(a)\,da \int_0^1 K_{B(x,b)}(a)(a-x)\bigl(\mu(a) - M^1(x)\bigr)s(a)\,da.$$

Similarly, the denominator of $A_{22}$ can be written as

$$[A_{22}]^{den} \to \int_0^1 K_{B(x,b)}(a)(a-x)^2 s(a)\,da \int_0^1 K_{B(x,b)}(a)\,s(a)\,da - \Bigl(\int_0^1 K_{B(x,b)}(a)(a-x)\,s(a)\,da\Bigr)^2,$$

where $s(dx)$ is the finite invariant measure. Moreover, the Beta distribution $B\{x/b+1,\ (1-x)/b+1\}$ has mean $(x+b)/(1+2b)$ and variance $b(x+b)(1-x+b)/\{(1+2b)^2(1+3b)\}$. Using the Taylor expansion of $\mu(a) - M^1(x)$ around $a = x$, we obtain

$$A_{22} = \Bigl[\frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^1(x))'\Bigl\{1 + \frac{x(1-x)\,s'(x)}{s(x)}\Bigr\} + \frac{x(1-x)}{2}\,(M^1(x))''\Bigr]\,b + o(b).$$

Next, we consider $B_{22}$. Its quadratic variation process $[B_{22}, B_{22}]$ yields

$$[B_{22}, B_{22}] = \frac{\sum_{i=1}^{n-1} w_i^2 K_i^2 \int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds}{\bigl(\Delta\sum_{i=1}^{n} w_i K_i\bigr)^2}. \qquad (4.8)$$

Expanding $w_i^2$, the numerator splits as

$$[B_{22}, B_{22}]^{num} = (\Delta S_{n,2})^2\sum_{i=1}^{n-1} K_i^2\int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds + (\Delta S_{n,1})^2\sum_{i=1}^{n-1} K_i^2\,(X_{i\Delta}-x)^2\int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds - 2(\Delta S_{n,2})(\Delta S_{n,1})\sum_{i=1}^{n-1} K_i^2\,(X_{i\Delta}-x)\int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds =: [B_{22}, B_{22}]^{num}_1 + [B_{22}, B_{22}]^{num}_2 - [B_{22}, B_{22}]^{num}_3. \qquad (4.9)$$

For the first of these,

$$\sum_{i=1}^{n-1} K_i^2\int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds = \int_0^T K^2_{B(x,b)}\bigl(X_s + o_{a.s.}(1)\bigr)\,\sigma^2\bigl(X_s + o_{a.s.}(1)\bigr)\,ds.$$

Define

$$A_b(x) = B\{2x/b + 1,\ 2(1-x)/b + 1\}\big/ B^2\{x/b + 1,\ (1-x)/b + 1\};$$

then $K^2_{B(x,b)}(\cdot) = A_b(x)\,K_{B(2x,b)}(\cdot)$, so that

$$\int_0^T K^2_{B(x,b)}(X_s)\,\sigma^2(X_s)\,ds = A_b(x)\int_0^T K_{B(2x,b)}(X_s)\,\sigma^2(X_s)\,ds = A_b(x)\int_0^1 K_{B(2x,b)}(a)\,\sigma^2(a)\,\bar L_X(T,a)\,da.$$

Then, around $a = x$,

$$[B_{22}, B_{22}]^{num}_1 = (B_2)^2\,A_b(x)\,\sigma^2(x)\,\bar L_X(T,x).$$

Following $[B_{22}, B_{22}]^{num}_1$, we derive

$$[B_{22}, B_{22}]^{num}_2 = (B_1)^2 B_2\,A_b(x)\,\sigma^2(x)\,\bar L_X(T,x), \qquad [B_{22}, B_{22}]^{num}_3 = 2(B_1)^2 B_2\,A_b(x)\,\sigma^2(x)\,\bar L_X(T,x).$$

These imply that

$$[B_{22}, B_{22}]^{num} = \bigl[(B_2)^2 - (B_1)^2 B_2\bigr]\,A_b(x)\,\sigma^2(x)\,\bar L_X(T,x).$$

Similarly, we get

$$[B_{22}, B_{22}]^{den} = \bigl[B_2 - (B_1)^2\bigr]^2\,\bigl(\bar L_X(T,x)\bigr)^2.$$

Then

$$[B_{22}, B_{22}] = \frac{\bigl[(B_2)^2 - (B_1)^2 B_2\bigr]\,A_b(x)\,\sigma^2(x)}{\bigl[B_2 - (B_1)^2\bigr]^2\,\bar L_X(T,x)}.$$

Following Chen (2000), we can obtain

$$A_b(x) = \begin{cases} \dfrac{b^{-1/2}}{2\sqrt{\pi}\sqrt{x(1-x)}} + o(b^{-1/2}) & \text{for interior } x,\\[2mm] \dfrac{b^{-1}\,\Gamma(2k+1)}{2^{2k+1}\,\Gamma^2(k+1)} + o(b^{-1}) & \text{for boundary } x. \end{cases} \qquad (4.10)$$

Hence, for interior $x$,

$$b^{1/2}\,[B_{22}, B_{22}] \xrightarrow{a.s.} \frac{\sigma^2(x)\,B_M}{2\sqrt{\pi}\sqrt{x(1-x)}\,\bar L_X(T,x)} \qquad (4.11)$$

and

$$\sqrt{b^{1/2}\,\bar L_X(T,x)}\;B_{22} \Rightarrow N\Bigl(0,\ \frac{\sigma^2(x)\,B_M}{2\sqrt{\pi}\sqrt{x(1-x)}}\Bigr), \qquad (4.12)$$

with

$$B_M = \frac{(B_2)^2 - (B_1)^2 B_2}{\bigl[B_2 - (B_1)^2\bigr]^2}.$$

If $b^{5/2}\,\bar L_X(T,x) = O_{a.s.}(1)$, then we obtain

$$\sqrt{b^{1/2}\,\bar L_X(T,x)}\,\Bigl(\hat M^1_{LL}(x,b) - M^1(x) - b\,B_{\hat M^1_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{M^2(x)\,B_M}{2\sqrt{\pi}\sqrt{x(1-x)}}\Bigr),$$

where $B_{\hat M^1_{LL}(x,b)}$ is the bias given in Theorem 1. The results for boundary $x$ can be established similarly. This completes the proof of Theorem 1.
Proof of Theorem 2. Keep the notation of the proof of Theorem 1. For the consistency of $\hat M^2_{LL}(x,b)$, note that

$$(X_{(i+1)\Delta} - X_{i\Delta})^2 = X^2_{(i+1)\Delta} - X^2_{i\Delta} - 2X_{i\Delta}\bigl(X_{(i+1)\Delta} - X_{i\Delta}\bigr) = \int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds + 2\int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\mu(X_s)\,ds + 2\int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\sigma(X_s)\,dW_s.$$

Substituting the three terms into $\hat M^2_{LL}(x,b)$, we separate $\hat M^2_{LL}(x,b)$ into three parts, $A_{33} + B_{33} + C_{33}$, which we evaluate separately. The first term satisfies

$$A_{33} = \frac{\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}\sigma^2(X_s)\,ds}{\Delta\sum_{i=1}^{n} w_i K_i} \xrightarrow{a.s.} \frac{[B_2 - (B_1)^2]\,\bar L_X(T,x)\,\sigma^2(x)}{[B_2 - (B_1)^2]\,\bar L_X(T,x)} = \sigma^2(x) = M^2(x) \qquad (4.13)$$

around $a = x$. We now consider $B_{33}$. The previous arguments imply that

$$\int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\mu(X_s)\,ds = O_{a.s.}\Bigl(\bigl(\Delta\log(1/\Delta)\bigr)^{1/2}\Bigr)\int_{i\Delta}^{(i+1)\Delta}\mu(X_s)\,ds,$$

so that

$$B_{33} = \frac{2\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\mu(X_s)\,ds}{\Delta\sum_{i=1}^{n} w_i K_i} \xrightarrow{a.s.} 0. \qquad (4.14)$$

Similarly, we can show

$$C_{33} \xrightarrow{a.s.} 0. \qquad (4.15)$$

By combining (4.13), (4.14) and (4.15), we get $\hat M^2_{LL}(x,b) \xrightarrow{a.s.} M^2(x)$, which shows the consistency of the estimator $\hat M^2_{LL}(x,b)$.

The asymptotic normality of $\hat M^2_{LL}(x,b)$ can be obtained by considering

$$\hat M^2_{LL}(x,b) - M^2(x) = \frac{\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}\bigl(\sigma^2(X_s) - M^2(x)\bigr)\,ds}{\Delta\sum_{i=1}^{n} w_i K_i} + \frac{2\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\mu(X_s)\,ds}{\Delta\sum_{i=1}^{n} w_i K_i} + \frac{2\sum_{i=1}^{n-1} w_i K_i \int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})\,\sigma(X_s)\,dW_s}{\Delta\sum_{i=1}^{n} w_i K_i} =: A_{44} + B_{44} + C_{44}. \qquad (4.16)$$

Now, we consider

$$A_{44} = \frac{\Delta\sum_{i=1}^{n-1} w_i K_i\,\bigl(\sigma^2(X_{i\Delta}) - M^2(x) + o_{a.s.}(1)\bigr)}{\Delta\sum_{i=1}^{n} w_i K_i},$$

and following (4.7), we get

$$A_{44} = \Bigl[\frac{B_2 - B_1}{B_2 - (B_1)^2}\,(M^2(x))'\Bigl\{1 + \frac{x(1-x)\,s'(x)}{s(x)}\Bigr\} + \frac{x(1-x)}{2}\,(M^2(x))''\Bigr]\,b + o_{a.s.}(b). \qquad (4.17)$$

The treatment of $B_{44}$ is the same as that of $B_{33}$; we omit it here. Next, we consider $C_{44}$. Its quadratic variation process satisfies

$$[C_{44}, C_{44}] = \frac{4\sum_{i=1}^{n-1} w_i^2 K_i^2 \int_{i\Delta}^{(i+1)\Delta}(X_s - X_{i\Delta})^2\,\sigma^2(X_s)\,ds}{\bigl(\Delta\sum_{i=1}^{n} w_i K_i\bigr)^2} = \frac{4\,\Delta^2\sum_{i=1}^{n-1} w_i^2 K_i^2\,\bigl(\sigma^4(X_{i\Delta})/2 + o_{a.s.}(1)\bigr)}{\bigl(\Delta\sum_{i=1}^{n} w_i K_i\bigr)^2}.$$

By the same arguments as in (4.9), for interior $x$,

$$\frac{b^{1/2}}{\Delta}\,[C_{44}, C_{44}] \xrightarrow{a.s.} \frac{2\,\sigma^4(x)\,B_M}{\sqrt{\pi}\sqrt{x(1-x)}\,\bar L_X(T,x)},$$

which implies that

$$\sqrt{\frac{b^{1/2}\,\bar L_X(T,x)}{\Delta}}\;C_{44} \Rightarrow N\Bigl(0,\ \frac{2\,\sigma^4(x)\,B_M}{\sqrt{\pi}\sqrt{x(1-x)}}\Bigr). \qquad (4.18)$$

If $b^{5/2}\,\bar L_X(T,x)/\Delta = O_{a.s.}(1)$, then we get

$$\sqrt{\frac{b^{1/2}\,\bar L_X(T,x)}{\Delta}}\,\Bigl(\hat M^2_{LL}(x,b) - M^2(x) - b\,B_{\hat M^2_{LL}(x,b)}\Bigr) \Rightarrow N\Bigl(0,\ \frac{2(M^2(x))^2\,B_M}{\sqrt{\pi}\sqrt{x(1-x)}}\Bigr),$$

where $B_{\hat M^2_{LL}(x,b)}$ and $B_M$ are as given in Theorems 1 and 2. Similarly, we can obtain the results for boundary $x$. This completes the proof of Theorem 2.

The local linear estimators for the Gamma kernel function can also be obtained, by replacing $K_{B(x,b)}$ with $K_{G(x/b+1,\,b)}$. For the Gamma kernel, interior $x$ and boundary $x$ are design points satisfying $x/b \to \infty$ and $x/b \to d$ for some $d > 0$ as $n, T \to \infty$, respectively. Following Chen (2000), we have

$$A_b(x) = \begin{cases} \dfrac{b^{-1/2}}{2\sqrt{\pi}\,x^{1/2}} + o(b^{-1/2}) & \text{for interior } x,\\[2mm] \dfrac{b^{-1}\,\Gamma(2d+1)}{2^{2d+1}\,\Gamma^2(d+1)} + o(b^{-1}) & \text{for boundary } x, \end{cases}$$

where for the Gamma kernel

$$A_b(x) = b^{-1}\,\Gamma(2x/b + 1)\big/\bigl\{2^{2x/b+1}\,\Gamma^2(x/b + 1)\bigr\}$$

and the support is $[0, \infty)$. The remaining procedure is the same.
Following Chen (2000), we have

Ab (x) =

b1/2

2 x1/2

+ o(b1/2 )

b1 (2d+1)
22d+1 2 (d+1)

+ o(b1 )

f or interior x
f or boundary x.

Furthermore, it needs to dene


Ab (x) = b1 (2x/b + 1)/22x/b+1 2 (x/b + 1)
15

and [0, ). The remaining procedure is same.


The corollaries can be proved by following the same steps as in the proofs of the theorems.

Acknowledgments

I am grateful to the anonymous referees for the valuable comments which improved the quality of this paper.

References

Arapis, M., Gao, J., (2006) Empirical comparisons in short-term interest rate models using nonparametric methods. Journal of Financial Econometrics 4, 310-345.
Azema, J., Kaplan-Duflo, M., Revuz, D., (1967) Mesure invariante sur les classes recurrentes des processus de Markov. Z. Wahrscheinlichkeitstheorie verw. Gebiete 8, 157-181.
Bandi, F. M., Phillips, P., (2003) Fully nonparametric estimation of scalar diffusion models. Econometrica 71, 241-283.
Brown, B. M., Chen, S. X., (1999) Beta-Bernstein smoothing for regression curves with compact support. Scandinavian Journal of Statistics 26, 47-59.
Bouezmarni, T., Robin, J. M., (2003) Consistency of the beta kernel density function estimator. The Canadian Journal of Statistics 31, 89-98.
Bouezmarni, T., Scaillet, O., (2005) Consistency of asymmetric kernel density estimators and smoothed histograms with application to income data. Econometric Theory 21, 390-412.
Chen, S. X., (2000) Probability density function estimation using Gamma kernels. Annals of the Institute of Statistical Mathematics 52, 471-480.
Chen, S. X., (2002) Local linear smoothers using asymmetric kernels. Annals of the Institute of Statistical Mathematics 54, 312-323.
Cleveland, W. S., (1979) Robust locally weighted regression and smoothing scatterplots. Journal of the American Statistical Association 74, 829-836.
Eubank, R. L., (1988) Spline Smoothing and Nonparametric Regression. Marcel Dekker, New York.
Fan, J., (1992) Design-adaptive nonparametric regression. Journal of the American Statistical Association 87, 998-1004.

Fan, J., Gijbels, I., (1992) Variable bandwidth and local linear regression smoothers. Annals of Statistics 20, 2008-2036.
Fan, J., Gijbels, I., (1996) Local Polynomial Modelling and Its Applications. Chapman and Hall, London.
Fan, J., Zhang, C., (2003) A re-examination of diffusion estimators with applications to financial model validation. Journal of the American Statistical Association 98, 118-134.
Gospodinov, N., Hirukawa, M., (2011) Nonparametric estimation of scalar diffusion processes of interest rates using asymmetric kernels. Working paper, January 2011.
Gustafsson, J., Hagmann, M., Nielsen, J. P., Scaillet, O., (2009) Local transformation kernel density estimation of loss distributions. Journal of Business and Economic Statistics 27, 161-175.
Hardle, W., (1990) Applied Nonparametric Regression. Cambridge University Press.
Hagmann, M., Scaillet, O., (2007) Local multiplicative bias correction for asymmetric kernel density estimators. Journal of Econometrics 141, 213-249.
Jiang, G. J., Knight, J., (1997) A nonparametric approach to the estimation of diffusion processes with an application to a short-term interest rate model. Econometric Theory 13, 615-645.
Jones, M. C., Henderson, D. A., (2007) Kernel-type density estimation on the unit interval. Biometrika 94, 977-984.
Nicolau, J., (2003) Bias reduction in nonparametric diffusion coefficient estimation. Econometric Theory 19, 754-777.
Protter, P., (1995) Stochastic Integration and Differential Equations. Springer, New York.
Renault, O., Scaillet, O., (2004) On the way to recovery: A nonparametric bias free estimation of recovery rate densities. Journal of Banking and Finance 28, 2915-2931.
Revuz, D., Yor, M., (1998) Continuous Martingales and Brownian Motion. Springer, New York.
Ruppert, D., Wand, M. P., (1994) Multivariate locally weighted least squares regression. Annals of Statistics 22, 1346-1370.
Scaillet, O., (2004) Density estimation using inverse and reciprocal inverse Gaussian kernels. Journal of Nonparametric Statistics 16, 217-226.
Shunpu, Z., (2010) A note on the performance of the Gamma kernel estimators at the boundary. Statistics and Probability Letters 80, 548-557.
Stanton, R., (1997) A nonparametric model of term structure dynamics and the market price of interest rate risk. Journal of Finance 52, 1973-2002.

Stone, C. J., (1977) Consistent nonparametric regression. Annals of Statistics 5, 595-645.
