Abstract.
The Berry-Esséen upper bounds for the moment estimators and least squares estimators of the mean and drift coefficients in Vasicek models driven by general Gaussian processes are studied. In the parameter estimation problem for the Ornstein-Uhlenbeck (OU) process driven by fractional Brownian motion, the method commonly used is the one given by Kim and Park, who show an upper bound for the Kolmogorov distance between the distribution of the ratio of two double Wiener-Itô stochastic integrals and the normal distribution. The main innovation of this paper is to extend this ratio process, so that the numerator and denominator each contain Wiener-Itô stochastic integrals of order at most three. To the best of our knowledge, the upper bounds between the distributions of the above estimators and the normal distribution are new.
1. Introduction
The Vasicek model is a one-dimensional stochastic process used in various fields, such as economics, finance, and environmental science. It was originally introduced to describe short-term interest rate fluctuations influenced by a single market factor.
Proposed by O. Vasicek [1], it was the first stochastic process model to describe the "mean reversion" characteristic of short-term interest rates. In finance, it can also be used as a random investment model, as in Wu et al. [2] and Han et al. [3].
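For orientation, the classical Vasicek short-rate dynamics of [1] can be written (in generic, purely illustrative notation, with mean-reversion speed $k>0$, long-run mean $\theta$, volatility $\sigma>0$ and a standard Brownian motion $W$) as:

```latex
\mathrm{d}r_t = k\,(\theta - r_t)\,\mathrm{d}t + \sigma\,\mathrm{d}W_t .
```

The drift is positive below $\theta$ and negative above it, which is exactly the "mean reversion" property mentioned above.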
Definition 1.
Consider the Vasicek model driven by a general Gaussian process; it satisfies the following stochastic differential equation (SDE):
(1)
where the driving noise is a general one-dimensional centered Gaussian process that satisfies Hypothesis 1.
This paper mainly focuses on the convergence rate of the estimators of the drift coefficients. Without loss of generality, the Vasicek model can then be represented in the following form:
When the coefficients in the drift function are unknown, an important problem is to estimate the drift coefficients based on the observations.
For the model driven by Brownian motion, Fergusson and Platen [4] present the maximum likelihood estimators of the coefficients in the Vasicek model. For the Vasicek model driven by fractional Brownian motion, Xiao and Yu [5] consider the least squares estimators and their asymptotic behavior, while Hu and Nualart [6] study the moment estimation problem.
Since the Gaussian process mainly determines the trajectory properties of the Vasicek model, we follow the assumptions in Chen and Zhou [7] and make the following hypothesis about the driving noise.
Hypothesis 1 ([7] Hypothesis 1.1).
The covariance function of the Gaussian process satisfies the following condition:
(2)
where the constants above are independent of $t$ and $s$.
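A guiding example (given here only for illustration; it is not part of the hypothesis) is fractional Brownian motion with Hurst index $H \in (1/2, 1)$, whose covariance admits an off-diagonal density of power type:

```latex
R(t,s) = \tfrac12\left(t^{2H} + s^{2H} - |t-s|^{2H}\right),
\qquad
\frac{\partial^2 R}{\partial t\,\partial s} = H(2H-1)\,|t-s|^{2H-2}
\quad (t \neq s).
```

This is the prototypical covariance exhibiting power-type behavior of the kind imposed in (2).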
Assuming that only one trajectory is observed, we can construct the least squares estimators (LSE) and the moment estimators (ME) (see [8, 9, 5, 10] for more details).
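As a concrete illustration of the moment-type estimator of the mean level, the following sketch simulates one trajectory in the special case where the driving noise is a standard Brownian motion, and approximates the continuous-time sample mean by a Riemann sum. The parameter names (`kappa`, `theta`, `sigma`) are illustrative and not taken from the model above.

```python
import random

def simulate_vasicek(kappa, theta, sigma, x0, T, n, seed=0):
    """Euler-Maruyama path of dX_t = kappa*(theta - X_t) dt + sigma dW_t."""
    rng = random.Random(seed)
    dt = T / n
    x = x0
    path = [x]
    for _ in range(n):
        x += kappa * (theta - x) * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def sample_mean_estimator(path, T):
    """Riemann-sum approximation of the continuous-time sample mean (1/T) * int_0^T X_t dt."""
    dt = T / (len(path) - 1)
    return sum(path[:-1]) * dt / T

T = 50.0
path = simulate_vasicek(kappa=2.0, theta=1.0, sigma=0.2, x0=0.0, T=T, n=5000)
theta_hat = sample_mean_estimator(path, T)  # close to the long-run mean for large T
```

By ergodicity of the mean-reverting dynamics, the time average concentrates around the long-run mean as the horizon grows, which is the heuristic behind the continuous-time sample mean of Proposition 1 below.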
Proposition 1 ([11] (4) and (5)).
The estimator of is the continuous-time sample mean:
(3)
The second moment estimator of is given by
(4)
Following Xiao and Yu [5], we present the LSE in the Vasicek model.
Proposition 2 ([11] (7) and (8)).
The LSE is motivated by the argument of minimizing a quadratic function of the two drift parameters:
Solving the resulting equations, we can obtain the LSE of the two coefficients:
(5)

(6)
where the integral is an Itô-Skorohod integral.
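In generic notation (again purely illustrative), for a model of the form $\mathrm{d}X_t = (a - b X_t)\,\mathrm{d}t + \mathrm{d}G_t$, the least squares criterion formally penalizes

```latex
(a, b) \;\longmapsto\; \int_0^T \bigl|\dot X_t - (a - b X_t)\bigr|^2 \,\mathrm{d}t ,
```

and setting its two partial derivatives to zero yields a linear $2\times2$ system in $(a,b)$; interpreting the resulting stochastic integrals in the Itô-Skorohod sense gives estimators of the form (5)-(6).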
Pei et al. [11] prove the following consistency results and central limit theorems (CLT) for the estimators.
Theorem 3 ([11], Theorem 1.2).
When Hypothesis 1 is satisfied, both the ME and the LSE of the drift coefficient are strongly consistent; that is,
Theorem 4 ([11], Theorem 1.3).
Assume Hypothesis 1 is satisfied. When the driving process is self-similar, the estimators are asymptotically normal as the observation time tends to infinity; that is,
In the other case,
where
(7)
Similarly, the mean estimator is also asymptotically normal:
We now present the main theorems of the paper; their details are given in the following sections.
Theorem 5.
Let be a standard Normal random variable, and be the constant defined by (7).
Assume Hypothesis 1 is satisfied. When the observation time is large enough, there exists a constant such that
(8)

(9)
Next, we show the convergence speed of the mean coefficient estimators.
Theorem 6.
Assume the driving noise is a self-similar Gaussian process satisfying Hypothesis 1. Then there exists a constant such that
(10)

(11)
2. Preliminaries
In this section, we recall some basic facts about Malliavin calculus with respect to a Gaussian process. The reader is referred to [12, 13, 14] for a more detailed explanation. Let $G = \{G_t,\ t \in [0,T]\}$ be a continuous centered Gaussian process with $G_0 = 0$ and covariance function
\[ R(t,s) = \mathbb{E}\left[G_t G_s\right], \qquad t, s \in [0,T], \tag{12} \]
defined on a complete probability space $(\Omega, \mathcal{F}, P)$, where $\mathcal{F}$ is generated by the Gaussian family $G$.
Denote by $\mathcal{E}$ the space of all real-valued step functions on $[0,T]$. The Hilbert space $\mathfrak{H}$ is defined as the closure of $\mathcal{E}$ endowed with the inner product:
\[ \left\langle \mathbf{1}_{[0,t]},\ \mathbf{1}_{[0,s]} \right\rangle_{\mathfrak{H}} = R(t,s). \tag{13} \]
We denote by $X = \{X(h),\ h \in \mathfrak{H}\}$ the isonormal Gaussian process on the probability space, indexed by the elements in $\mathfrak{H}$, which satisfies the following isometry relationship:
\[ \mathbb{E}\left[X(g)\, X(h)\right] = \langle g, h \rangle_{\mathfrak{H}}, \qquad g, h \in \mathfrak{H}. \tag{14} \]
The following proposition gives a representation of the inner product of the Hilbert space $\mathfrak{H}$ [15].
Proposition 7 ([7] Proposition 2.1).
Denote by $\mathcal{V}_{[0,T]}$ the set of bounded variation functions on $[0,T]$. Then $\mathcal{V}_{[0,T]}$ is dense in $\mathfrak{H}$ and
\[ \langle g, h \rangle_{\mathfrak{H}} = \int_{[0,T]^2} R(t,s)\, \nu_g(\mathrm{d}t)\, \nu_h(\mathrm{d}s), \qquad g, h \in \mathcal{V}_{[0,T]}, \]
where $\nu_g$ is the Lebesgue-Stieltjes signed measure associated with $g^0$, defined as
\[ g^0(x) = \begin{cases} g(x), & x \in [0,T], \\ 0, & \text{otherwise}. \end{cases} \]
When the covariance function satisfies Hypothesis 1,
\[ \langle g, h \rangle_{\mathfrak{H}} = \int_0^T \!\! \int_0^T g(t)\, h(s)\, \frac{\partial^2 R}{\partial t\, \partial s}\, \mathrm{d}t\, \mathrm{d}s. \tag{15} \]
Furthermore, the norm of the elements in $\mathfrak{H}$ is induced naturally:
\[ \|g\|_{\mathfrak{H}}^2 = \langle g, g \rangle_{\mathfrak{H}}. \]
Proposition 8 ([7] Proposition 3.2).
Suppose that Hypothesis 1 holds. Then, for any function of bounded variation on $[0,T]$,
(17)
Let $\mathfrak{H}^{\otimes p}$ and $\mathfrak{H}^{\odot p}$ be the $p$-th tensor product and the $p$-th symmetric tensor product of $\mathfrak{H}$. For every $p \ge 1$, denote by $\mathcal{H}_p$ the $p$-th Wiener chaos of $X$. It is defined as the closed linear subspace of $L^2(\Omega)$ generated by $\{H_p(X(h)) : h \in \mathfrak{H},\ \|h\|_{\mathfrak{H}} = 1\}$, where $H_p$ is the $p$-th Hermite polynomial. Let $h \in \mathfrak{H}$ be such that $\|h\|_{\mathfrak{H}} = 1$; then for every $p \ge 1$,
\[ I_p\left(h^{\otimes p}\right) = H_p\left(X(h)\right), \]
where $I_p$ is the $p$-th multiple Wiener-Itô stochastic integral.
Denote by $\{e_k,\ k \ge 1\}$ a complete orthonormal system in $\mathfrak{H}$. For $f \in \mathfrak{H}^{\odot p}$, $g \in \mathfrak{H}^{\odot q}$ and $r \in \{0, \dots, p \wedge q\}$, the $r$-th contraction between $f$ and $g$ is an element in $\mathfrak{H}^{\otimes (p+q-2r)}$:
\[ f \otimes_r g = \sum_{i_1, \dots, i_r = 1}^{\infty} \left\langle f,\ e_{i_1} \otimes \cdots \otimes e_{i_r} \right\rangle_{\mathfrak{H}^{\otimes r}} \otimes \left\langle g,\ e_{i_1} \otimes \cdots \otimes e_{i_r} \right\rangle_{\mathfrak{H}^{\otimes r}}. \]
The following proposition shows the product formula for the multiple integrals.
Proposition 9 ([12] Theorem 2.7.10).
Let $f \in \mathfrak{H}^{\odot p}$ and $g \in \mathfrak{H}^{\odot q}$ be two symmetric functions. Then
\[ I_p(f)\, I_q(g) = \sum_{r=0}^{p \wedge q} r! \binom{p}{r} \binom{q}{r}\, I_{p+q-2r}\left(f\, \widetilde{\otimes}_r\, g\right), \tag{18} \]
where $f\, \widetilde{\otimes}_r\, g$ is the symmetrization of $f \otimes_r g$.
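For instance, in the lowest-order case $p = q = 1$ the product formula reduces to the elementary identity

```latex
I_1(f)\, I_1(g) = I_2\left(f\, \widetilde{\otimes}\, g\right) + \langle f, g \rangle_{\mathfrak{H}} ,
```

that is, the product of two centered jointly Gaussian random variables splits into a second-chaos term plus their covariance.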
We then introduce the derivative operator and the divergence operator. For these details, see sections 2.3-2.5 of [12].
Let $\mathcal{S}$ be the class of smooth random variables of the form:
\[ F = f\left(X(h_1), \dots, X(h_n)\right), \]
where $f \in C_p^{\infty}(\mathbb{R}^n)$ (its partial derivatives have at most polynomial growth) and $h_i \in \mathfrak{H}$ for $i = 1, \dots, n$. Then, the Malliavin derivative of $F$ (with respect to $X$) is the element of $L^2(\Omega; \mathfrak{H})$ defined by
\[ DF = \sum_{i=1}^{n} \frac{\partial f}{\partial x_i}\left(X(h_1), \dots, X(h_n)\right) h_i. \]
Given $p \ge 1$ and an integer $k \ge 1$, let $\mathbb{D}^{k,p}$ denote the closure of $\mathcal{S}$ with respect to the norm
\[ \|F\|_{k,p}^{p} = \mathbb{E}\left[|F|^{p}\right] + \sum_{j=1}^{k} \mathbb{E}\left[\left\|D^{j} F\right\|_{\mathfrak{H}^{\otimes j}}^{p}\right]. \]
Denote by $\delta$ (the divergence operator) the adjoint of $D$. The domain of $\delta$ is composed of those elements $u \in L^2(\Omega; \mathfrak{H})$ such that:
\[ \left|\mathbb{E}\left[\langle DF, u \rangle_{\mathfrak{H}}\right]\right| \le c_u\, \|F\|_{L^2(\Omega)} \qquad \text{for all } F \in \mathcal{S}, \]
and is denoted by $\operatorname{Dom} \delta$. If $u \in \operatorname{Dom} \delta$, then $\delta(u)$ is the unique element of $L^2(\Omega)$ characterized by the duality formula:
\[ \mathbb{E}\left[F\, \delta(u)\right] = \mathbb{E}\left[\langle DF, u \rangle_{\mathfrak{H}}\right], \qquad F \in \mathbb{D}^{1,2}. \]
We now introduce the infinitesimal generator of the Ornstein-Uhlenbeck semigroup. Let $F$ be a square integrable random variable. Denote by $J_n$ the orthogonal projection on the $n$-th Wiener chaos $\mathcal{H}_n$. The operator $L$ is defined by $LF = \sum_{n \ge 1} -n J_n F$. The domain of $L$ is
\[ \operatorname{Dom} L = \Big\{ F \in L^2(\Omega) :\ \sum_{n \ge 1} n^2\, \left\|J_n F\right\|_{L^2(\Omega)}^2 < \infty \Big\}. \]
For any $F \in L^2(\Omega)$, define $L^{-1}F = \sum_{n \ge 1} -\frac{1}{n} J_n F$. The operator $L^{-1}$ is called the pseudo-inverse of $L$. Note that $L^{-1}F \in \operatorname{Dom} L$ and $L L^{-1} F = F - \mathbb{E}[F]$ holds for any $F \in L^2(\Omega)$.
The following Lemma 10 provides the Berry-Esséen upper bound on the sum of two random variables.
Lemma 10 ([16] Lemma 2).
For any random variables $\xi$ and $\eta$ and any $\gamma > 0$, the following inequality holds:
(19)
where $\Phi$ is the standard normal distribution function.
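Bounds of this shape come from a standard two-sided comparison: for any $\gamma > 0$, since $\Phi$ is Lipschitz with constant $1/\sqrt{2\pi}$,

```latex
P(\xi + \eta \le z)
\le P(\xi \le z + \gamma) + P(|\eta| > \gamma)
\le \Phi(z) + \sup_{z \in \mathbb{R}} \bigl| P(\xi \le z) - \Phi(z) \bigr|
   + \frac{\gamma}{\sqrt{2\pi}} + P(|\eta| > \gamma),
```

and the matching lower estimate is obtained symmetrically; taking the supremum over $z$ yields an inequality of the type (19).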
Using Malliavin calculus, Kim and Park [17] provide the Berry-Esséen upper bound of the quotient of two random variables.
Let the process in the numerator have zero mean, and assume that the denominator is almost surely nonzero. For simplicity, we define the following four functions:
Theorem 11 ([17] Theorem 2 and Corollary 1).
Let the reference variable be standard normal. Assume that, for every observation time, the numerator has an absolutely continuous law with respect to Lebesgue measure, and that the four functions above tend to zero as the observation time tends to infinity. Then, there exists a constant such that, for the observation time large enough,
3. Berry-Esséen upper bounds of moment estimators
In this section, we will prove the Berry-Esséen upper bounds for the moment estimators of the Vasicek model.
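Bounds of this kind can be probed numerically. The sketch below, under the illustrative assumption of a Brownian driver, simulates many replications of the continuous-time sample mean, studentizes the errors, and evaluates the Kolmogorov (sup) distance to the standard normal CDF via the empirical distribution function; all names (`kappa`, `theta`, `sigma`, `sample_mean_error`) are hypothetical.

```python
import random
from statistics import NormalDist, mean, stdev

def sample_mean_error(kappa, theta, sigma, T, n, rng):
    """One replication: Riemann-sum sample mean of an Euler path of
    dX_t = kappa*(theta - X_t) dt + sigma dW_t, minus the true mean theta."""
    dt = T / n
    x = theta  # start at the mean level to shorten the transient
    acc = 0.0
    for _ in range(n):
        acc += x * dt
        x += kappa * (theta - x) * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
    return acc / T - theta

def kolmogorov_distance(sample):
    """sup_z |F_n(z) - Phi(z)| over the sorted sample (Phi = standard normal CDF)."""
    phi = NormalDist().cdf
    xs = sorted(sample)
    n = len(xs)
    return max(max((i + 1) / n - phi(z), phi(z) - i / n) for i, z in enumerate(xs))

rng = random.Random(1)
errors = [sample_mean_error(2.0, 1.0, 0.2, T=10.0, n=400, rng=rng) for _ in range(300)]
m, s = mean(errors), stdev(errors)
# The studentized errors are nearly Gaussian, so the distance is small.
dist = kolmogorov_distance([(e - m) / s for e in errors])
```

Repeating this for several horizons would trace out how the Kolmogorov distance decays with the observation time, which is exactly what the Berry-Esséen bounds of this section quantify.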
For the convenience of the following discussion, we first define:
(20)
where is a standard Normal variable. Next, we introduce the CLT of .
Theorem 12 ([11] Proposition 4.19).
Assume the driving noise is a self-similar Gaussian process satisfying Hypothesis 1. Then the statistic is asymptotically normal:
(21)
where the limiting random variable is given by a stochastic integral with respect to the driving process.
Following from the above Theorem, we can obtain the expanded form of (20):
Then, we can prove the convergence speed of .
Proof of formula (10).
According to Lemma 10, we have
Since the driving process is self-similar, the rescaled variable is a standard normal variable,
Following from Chebyshev inequality, we can obtain
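The Chebyshev step used here is simply that, for a centered remainder term $\eta_T$ and any threshold $\gamma_T > 0$ (notation illustrative),

```latex
P\left(|\eta_T| > \gamma_T\right) \;\le\; \frac{\mathbb{E}\bigl[\eta_T^2\bigr]}{\gamma_T^{2}} .
```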
Proposition 3.10 of [11] ensures that this quantity is bounded. Combining the above results, we have
(22)
When the observation time is sufficiently large, there exists a constant such that formula (10) holds.
∎
Similarly, we review the corresponding central limit theorem.
Theorem 13 ([11] Proposition 4.18).
Assume the driving noise is a Gaussian process satisfying Hypothesis 1. Then the estimator is asymptotically normal:
The following lemma gives an upper bound on the relevant expectation.
Lemma 14.
Let be the process defined by
Under the stated conditions, there exists a constant independent of the observation time such that
(23)
Proof.
According to (14) and (17), we can obtain
It is easy to see that
(24)
where is a constant. Also, we have
(25)
Combining the above two formulas, we obtain (23).
∎
Then we can obtain the Berry-Esséen upper bound of the ME.
Proof of formula (8).
According to [11] Proposition 4.18, we have
We introduce the corresponding tail probability.
Then we can obtain
(26)
Lemma 5.4 of [7] ensures that
Combining with Lemma 15, we obtain the desired result.
∎
The following lemma provides the required upper bound.
Lemma 15.
When the observation time is large enough, there exists a constant such that
Proof.
Since the Normal distribution is symmetric, we have
Consider the following processes:
(27)
where the auxiliary process is an OU process driven by the Gaussian noise. According to formula (63) of [11], we can obtain
Lemma 10 ensures that
(28)
According to [7] Theorem 1.4, we have
(29)
where the constant is independent of the observation time. We then consider the second term on the right side of (28).
Combining the Chebyshev inequality, Lemma 14 and Proposition 3.10 of [11], we can obtain
Then we have
(30)
where is a constant. Combining formulas (29) and (30), we obtain the desired result.
∎
4. Berry-Esséen upper bounds of least squares estimators
For the convenience of the following proofs, we introduce some notation:
Proposition 3.10 of [11] and Proposition 9 ensure that
where the constant is independent of the observation time. We also introduce further auxiliary functions:
We now extend Corollary 1 of [17].
Theorem 16.
Let the statistic be a zero-mean ratio process whose numerator and denominator contain Wiener-Itô stochastic integrals of order at most three, and whose scaling factor is a positive function of the observation time converging to a constant. Suppose the four auxiliary functions above tend to zero as the observation time tends to infinity. Then, there exists a constant such that, when the observation time is large enough,
(31)
Proof.
We first consider . It is easy to see that .
Then we deal with . Denote , we have
(32)
Following from the orthogonality property of multiple integrals, we can obtain
(33)
where is a constant independent of .
We next consider and . Denote , we have
Similarly, there exists a constant such that
(34)
Following from the above result, we can obtain the bound of .
We then deal with . According to the orthogonality and (32), we have
(35)
Combining formulas (33), (34) and (35), we obtain the desired result.
∎
We now prove the convergence speed of . First, we review the CLT of .
Theorem 17 ([11] Proposition 4.20).
When Hypothesis 1 holds, the estimator satisfies the following central limit theorem:
(36)
We then transform the above into multiple Wiener-Itô integrals.
Proposition 18.
Consider the OU process driven by the Gaussian noise. The statistic can be rewritten as:
(37)
where
(38a)-(38f)
Proof.
We first deal with the numerator of (36):
(39)
According to the definitions of the above functions, and combining with Proposition 9, we can obtain
We then treat the second term of (39). We have
Besides, we can obtain
Next, we consider the denominator. For , we have
Since , we can obtain
Combining the above formulas, we obtain the desired result.
∎
We now show the convergence speed of the zero-mean part.
Lemma 19.
Let the reference variable be standard normal and assume Hypothesis 1 holds. When the observation time is large enough, there exists a constant such that
(40)
Proof.
According to [11] formulas (9) and (47),
Combining with [12] Lemma 5.2.4, we can obtain
(41)
where the constant is independent of the observation time.
Following from [7] Theorem 1.4 and (41), we have
where is a constant independent of .
The Minkowski inequality ensures the corresponding moment bound. Similarly, Proposition 3.10 of [11] implies that
(42)
Formula (5.13) of [7] ensures that
Combining the above results, Theorem 16 and Cauchy-Schwarz inequality, we obtain the Lemma.
∎
The following Lemma shows the bound of non-zero mean part.
Lemma 20.
Assume Hypothesis 1 holds. When the observation time is large enough, there exists a constant such that
Proof.
Proposition 3.14 and Corollary 3.15 of [11] ensure that, when the observation time is large enough,
(43)
Then we can obtain
where the constant does not depend on the observation time. Furthermore, according to (38a), there exists a constant such that
Combining the above two formulas, we have
Then we obtain the desired result.
∎
We now prove the formula (9).
Proof of formula (9).
According to Lemma 10,
Combining Lemmas 19 and 20, we obtain the desired result.
∎
Pei et al. [11] show the CLT of the least squares estimator of the mean coefficient.
Theorem 21 ([11] Proposition 4.21).
Assume the driving noise is a self-similar Gaussian process satisfying Hypothesis 1. The estimator is asymptotically normal:
(44)
We also transform the estimator into the following multiple integrals.
Proposition 22.
The estimator can be represented as:
where
(45a)-(45g)
Proof.
We first consider the denominator.
According to Proposition 18,
Then, we deal with :
We next consider the numerator:
By Proposition 18, we have
Similarly, we can obtain
Let , we have
Combining the above formulas, we obtain the Proposition.
∎
Denote and .
We now prove the upper bounds of zero-mean part.
Lemma 23.
Let the reference variable be standard normal. Assume the driving noise is a self-similar Gaussian process satisfying Hypothesis 1. When the observation time is large enough, there exists a constant such that
(46)
Proof.
According to Lemma 19, (45f) and (45g), we have
where the constant is independent of the observation time. It is easy to see that Lemma 19 and equation (45e) imply the corresponding bound. Furthermore, when the observation time is large enough, there exists a constant such that
(47)
Since the driving process is self-similar, Lemma 19 applies. Also, we can obtain
(48)
We then combine the above three formulas.
Following from Theorem 16 and Cauchy-Schwarz inequality, we obtain the result.
∎
We next consider the non-zero mean part.
Lemma 24.
Assume the driving noise is a self-similar Gaussian process satisfying Hypothesis 1. When the observation time is large enough, there exists a constant independent of it such that
Proof.
Following from Lemma 23, we have , where is a constant independent of . Furthermore, Equation (45a) implies that there exists such that
Combining the above formulas, we obtain the desired result.
∎
We now prove the formula (11).
Proof of formula (11).
Following from Lemma 10, we have
Combining Lemmas 23 and 24, we obtain the formula (11).
∎