Indirect inference for locally stationary ARMA processes with stable innovations
Abstract
The class of locally stationary processes assumes a time-varying spectral representation, that is, the existence of a finite second moment. We propose the α-stable locally stationary process, obtained by modifying the innovations to stable distributions, and use indirect inference to estimate this type of model. Due to the infinite variance, some interesting properties, such as the time-varying autocorrelation, cannot be defined. However, since the α-stable family of distributions is closed under linear combinations and can accommodate asymmetry and thicker tails, the proposed model has the same tail behavior throughout time. In this paper, we propose this new model, present theoretical properties of the process and carry out simulations related to the indirect inference in order to estimate the parametric form of the model. Finally, an empirical application is illustrated.
Keywords: Locally stationary process; stable distribution; indirect inference

1 Introduction
The class of locally stationary processes describes processes that are approximately stationary in a neighborhood of each time point but whose structure, such as covariances and parameters, changes gradually over the observation period [1, 2]. This type of process has been shown to admit a meaningful asymptotic theory through infill asymptotics. The idea of this approach is that the time-varying parameters are rescaled to the unit interval, so that more available observations imply a larger contribution to each local structure. Consequently, statistical asymptotic results such as consistency, asymptotic normality, efficiency, locally asymptotically normal expansions, etc. are obtained. [3] provided a review of this type of process.
Most results on locally stationary processes assume innovations with finite second moment. However, many areas have observed phenomena with heavy-tailed distributions or infinite variance. In this work, we assume that the innovations of the locally stationary process follow stable distributions. The advantage of assuming stable distributions is their flexibility in accommodating asymmetry and thick tails. Moreover, the family is closed under linear combinations and includes the Gaussian distribution as a special case. However, estimation is difficult since the density function does not have a closed form. Consequently, the usual estimation methods, such as maximum likelihood and the method of moments, do not work.
Alternative estimation approaches have been proposed, such as methods based on quantiles [4] or on the empirical characteristic function [5]. However, those methods are only useful for estimating the parameters of the stable distribution and, therefore, are difficult to apply to more complex models.
Our strategy for estimating this kind of process is the indirect inference proposed by [6] and [7]. Since stable distributions can be easily simulated, the indirect approach, a computationally intensive simulation-based method, is a natural way to overcome the estimation problem.
Indirect inference has been successfully applied to models involving stable distributions, such as independent samples from α-stable distributions and stable ARMA processes [8]. Moreover, several other time series models involving stable distributions have also been successfully estimated by indirect inference [9, 10, 11, 12].
Our contribution in this work is twofold. First, we propose locally stationary processes with stable innovations and present the theoretical properties of this model; we also justify why we call them stable locally stationary processes. Second, we propose indirect inference to estimate the models with linear time-varying coefficients.
The paper is organized as follows. In Section 2, we review the basic background on locally stationary processes, the stable distribution and indirect inference. Properties of the stable locally stationary processes are presented in Section 3. Section 4 describes indirect inference for this kind of process. Simulations are performed to study the indirect inference approach in Section 5. A wind data application is presented in Section 6. Finally, conclusions are given in Section 7.
2 Background
2.1 Locally stationary processes
Locally stationary processes were introduced through a time-varying spectral representation [2]. However, we use the time-domain version as in [13], since stable distributions do not have a finite second moment.
Definition 2.1.
The sequence of stochastic processes $\{X_{t,T}\}$, $t = 1, \ldots, T$, is a linear locally stationary process if $X_{t,T}$ has a representation

$$X_{t,T} = \sum_{j=-\infty}^{\infty} a_{t,T}(j)\, \varepsilon_{t-j}, \qquad (1)$$

where the following conditions are satisfied:

(i)

$$\sup_{t} |a_{t,T}(j)| \le \frac{K}{\ell(j)}, \qquad (2)$$

with a constant $K$ not depending on $T$ and $\ell(j) = \max\{1, |j| \log^{1+\kappa}|j|\}$ for some $\kappa > 0$;

(ii)

Let $V(g)$ be the total variation of a function $g$ on $[0,1]$; then, there exist functions $a(\cdot, j) : (0,1] \to \mathbb{R}$ with

$$\sup_{u} |a(u,j)| \le \frac{K}{\ell(j)}, \qquad (3)$$
$$\sup_{j} \sum_{t=1}^{T} \Big| a_{t,T}(j) - a\big(\tfrac{t}{T}, j\big) \Big| \le K, \qquad (4)$$
$$V\big(a(\cdot,j)\big) \le \frac{K}{\ell(j)}; \qquad (5)$$

(iii)

the $\varepsilon_t$ are i.i.d. with $\mathrm{E}\,\varepsilon_t = 0$ and $\mathrm{E}\,\varepsilon_t^2 = 1$.
Note that condition (iii) in Definition 2.1 assumes that the innovations have zero mean and unit variance. In this case, there exists a spectral representation of the process and a time-varying spectral density. Note also that classical stationary processes arise as a special case when all parameter curves are constant.
2.2 α-stable distribution
Stable distributions, as an extension of the Gaussian distribution, can be defined by their characteristic function:

$$\mathrm{E}\exp(i\theta X) = \begin{cases} \exp\!\big\{-\sigma^{\alpha}|\theta|^{\alpha}\big(1 - i\beta\,\mathrm{sign}(\theta)\tan\tfrac{\pi\alpha}{2}\big) + i\mu\theta\big\}, & \alpha \neq 1,\\ \exp\!\big\{-\sigma|\theta|\big(1 + i\beta\,\mathrm{sign}(\theta)\tfrac{2}{\pi}\ln|\theta|\big) + i\mu\theta\big\}, & \alpha = 1, \end{cases} \qquad (6)$$

where $\alpha \in (0,2]$ is the index of stability (tail heaviness), $\sigma > 0$ is the scale parameter, $\beta \in [-1,1]$ the asymmetry parameter and $\mu \in \mathbb{R}$ the location parameter. It is denoted by $X \sim S_{\alpha}(\sigma, \beta, \mu)$. Important properties can be found in detail in [14].
This class of distributions generalizes some important known distributions: the normal distribution for $\alpha = 2$, the Cauchy distribution ($\alpha = 1$, $\beta = 0$) and the Lévy distribution ($\alpha = 1/2$, $\beta = 1$). However, it does not have a closed-form density function in general. Moreover, the non-existence of moments of order $\alpha$ or higher (when $\alpha < 2$) makes parameter estimation difficult. Different estimation approaches have been proposed, such as methods based on quantiles [4] and methods based on the empirical characteristic function [5]. Nevertheless, they are only useful for independent samples from stable distributions and difficult to apply to more complex models.
When a random variable has closed-form density and distribution functions, its simulation is an easy task. For the stable case, [15] proposed an algorithm to generate α-stable random variables. To simulate a random variable $Z \sim S_{\alpha}(1, \beta, 0)$:

1. generate a random variable $U$ uniformly distributed on $(-\pi/2, \pi/2)$ and an independent exponential random variable $W$ with mean $1$; then

2. let $B_{\alpha,\beta} = \arctan\big(\beta \tan\tfrac{\pi\alpha}{2}\big)/\alpha$ and $S_{\alpha,\beta} = \big[1 + \beta^{2}\tan^{2}\tfrac{\pi\alpha}{2}\big]^{1/(2\alpha)}$, and, for $\alpha \neq 1$, compute

$$Z = S_{\alpha,\beta}\, \frac{\sin\big(\alpha(U + B_{\alpha,\beta})\big)}{(\cos U)^{1/\alpha}} \left[\frac{\cos\big(U - \alpha(U + B_{\alpha,\beta})\big)}{W}\right]^{(1-\alpha)/\alpha}. \qquad (7)$$

Next, $X \sim S_{\alpha}(\sigma, \beta, \mu)$ is obtained by means of the standardization formula:

$$X = \begin{cases} \sigma Z + \mu, & \alpha \neq 1,\\ \sigma Z + \tfrac{2}{\pi}\beta\sigma\ln\sigma + \mu, & \alpha = 1. \end{cases} \qquad (8)$$
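For concreteness, the two steps above can be sketched in code. This is a minimal sketch (the paper's own experiments use R; here we use Python with NumPy), and the function name `rstable` is ours, not from the paper:

```python
import numpy as np

def rstable(n, alpha, beta=0.0, sigma=1.0, mu=0.0, rng=None):
    """Draw n variates from S_alpha(sigma, beta, mu) via Chambers-Mallows-Stuck."""
    rng = np.random.default_rng(rng)
    U = rng.uniform(-np.pi / 2, np.pi / 2, n)   # uniform angle on (-pi/2, pi/2)
    W = rng.exponential(1.0, n)                 # independent unit-mean exponential
    if alpha != 1.0:
        B = np.arctan(beta * np.tan(np.pi * alpha / 2)) / alpha
        S = (1 + beta**2 * np.tan(np.pi * alpha / 2) ** 2) ** (1 / (2 * alpha))
        Z = (S * np.sin(alpha * (U + B)) / np.cos(U) ** (1 / alpha)
             * (np.cos(U - alpha * (U + B)) / W) ** ((1 - alpha) / alpha))
        return sigma * Z + mu
    # alpha == 1 uses its own CMS formula and standardization
    Z = (2 / np.pi) * ((np.pi / 2 + beta * U) * np.tan(U)
         - beta * np.log((np.pi / 2) * W * np.cos(U) / (np.pi / 2 + beta * U)))
    return sigma * Z + (2 / np.pi) * beta * sigma * np.log(sigma) + mu
```

Note that for $\alpha = 2$ the recipe yields $N(0, 2\sigma^2)$, which is why a scale of $1/\sqrt{2}$ recovers standard Gaussian innovations, a convention used later in the paper.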
2.3 Indirect inference
The indirect inference approach proposed by [6] is based on a very simple idea and is suitable for situations where direct estimation of the model is difficult. Let $y = (y_1, \ldots, y_T)$ be a sample of $T$ observations from a model of interest (IM) with parameter vector $\theta$, whose maximum likelihood estimator (MLE) is unavailable. Then, consider an auxiliary model (AM) depending on a parameter vector $\lambda$ whose likelihood function is easier to handle, but whose MLE is not necessarily consistent. The indirect inference is carried out by the following steps:
- Step 1: Compute the MLE $\hat{\lambda}_T$ of the AM based on $y$.
- Step 2: Simulate a set of $H$ vectors of size $T$ from the IM on the basis of an arbitrary parameter vector $\theta$. Let us denote each of those vectors as $\tilde{y}^{h}(\theta)$, $h = 1, \ldots, H$.
- Step 3: Then, estimate the parameters of the AM using the simulated values from the IM,

$$\tilde{\lambda}^{h}_{T}(\theta) = \arg\max_{\lambda} \log L\big(\tilde{y}^{h}(\theta); \lambda\big). \qquad (9)$$

- Step 4: Numerically update the initial guess $\theta$ in order to minimize the distance

$$\Big[\hat{\lambda}_T - \frac{1}{H}\sum_{h=1}^{H}\tilde{\lambda}^{h}_{T}(\theta)\Big]' \,\Omega\, \Big[\hat{\lambda}_T - \frac{1}{H}\sum_{h=1}^{H}\tilde{\lambda}^{h}_{T}(\theta)\Big], \qquad (10)$$

where $\Omega$ is a symmetric nonnegative definite matrix defining the metric.
For the choice of $\Omega$, [6] proved that when the parameter vectors of the AM and the IM have the same dimension and $T$ is sufficiently large, the estimator does not depend on the matrix $\Omega$. In this paper, we take $\Omega$ to be the identity matrix. Finally, the estimation step is performed with a numerical algorithm, such as Newton-Raphson: for a given estimate, the procedure yields an updated one, and the process is repeated until the sequence of estimates converges. The estimator is then given by

$$\hat{\theta} = \arg\min_{\theta} \Big[\hat{\lambda}_T - \frac{1}{H}\sum_{h=1}^{H}\tilde{\lambda}^{h}_{T}(\theta)\Big]' \,\Omega\, \Big[\hat{\lambda}_T - \frac{1}{H}\sum_{h=1}^{H}\tilde{\lambda}^{h}_{T}(\theta)\Big]. \qquad (11)$$
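As a toy illustration of Steps 1–4 (not the paper's tvARMA setting), consider estimating the location θ of a symmetric α-stable sample, whose exact MLE is intractable, with the sample median playing the role of the one-dimensional auxiliary estimator. The helper names, the crude grid minimization and the use of common random numbers across θ are our choices:

```python
import numpy as np

def sim_sas(n, alpha, theta, rng):
    """Symmetric alpha-stable sample shifted by location theta (CMS, beta = 0)."""
    U = rng.uniform(-np.pi / 2, np.pi / 2, n)
    W = rng.exponential(1.0, n)
    Z = (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
         * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))
    return theta + Z

def indirect_location(y, alpha, H=50, seed=0):
    lam_hat = np.median(y)                          # Step 1: auxiliary estimate on data
    T = len(y)
    grid = lam_hat + np.linspace(-1.0, 1.0, 401)    # candidate theta values
    def dist(theta):
        rng = np.random.default_rng(seed)           # common random numbers across theta
        lam_tilde = np.mean([np.median(sim_sas(T, alpha, theta, rng))
                             for _ in range(H)])    # Steps 2-3: simulate, re-estimate AM
        return (lam_hat - lam_tilde) ** 2           # Step 4 distance, Omega = identity
    return min(grid, key=dist)                      # crude Step-4 minimization
```

With common random numbers the simulated auxiliary statistic moves smoothly with θ, which is what makes the distance in Step 4 well behaved.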
3 tvARMA with stable innovations
An important example of a locally stationary process is the time-varying ARMA model, tvARMA for short. In this section, we consider this model with stable innovations.
Definition 3.1 (tvARMA with stable innovations).
Consider the system of difference equations

$$\sum_{j=0}^{p} \alpha_j\big(\tfrac{t}{T}\big)\, X_{t-j,T} = \sum_{k=0}^{q} \beta_k\big(\tfrac{t}{T}\big)\, \varepsilon_{t-k}, \qquad (12)$$

where the $\varepsilon_t$ are i.i.d. $S_{\alpha}(1/\sqrt{2}, \beta, 0)$ with $\alpha \in (0,2]$. Assume $\alpha_0(u) \equiv \beta_0(u) \equiv 1$, and $\alpha_j(u) = \alpha_j(0)$ and $\beta_k(u) = \beta_k(0)$ for $u < 0$. Suppose also that all $\alpha_j(\cdot)$ and $\beta_k(\cdot)$ are of bounded variation.
The scale parameter of the innovations is set to $1/\sqrt{2}$ because, when $\alpha = 2$, $S_2(1/\sqrt{2}, 0, 0) = N(0, 1)$, i.e., the standardized Gaussian innovation is obtained. It is possible to rewrite equation (12) as:
$$\alpha\big(\tfrac{t}{T}, B\big)\, X_{t,T} = \beta\big(\tfrac{t}{T}, B\big)\, \varepsilon_t, \qquad (13)$$

where $B$ is the backshift operator, and $\alpha(u, B) = \sum_{j=0}^{p} \alpha_j(u) B^{j}$ and $\beta(u, B) = \sum_{k=0}^{q} \beta_k(u) B^{k}$ are the autoregressive (AR) and moving average (MA) operators, respectively.
There are several works related to stable linear processes. For instance, Chapter 7 in [16] and Chapter 13 in [17] give a general review of stable linear processes. [18] studied infinite-variance stable ARMA processes and [19] studied fractional ARIMA with stable innovations. [20] proposed a Whittle-type estimator for the coefficients of the ARMA model. In the context of stable innovations and time-varying coefficients, [21, 22] considered the univariate and multivariate cases of the system (13) with symmetric stable innovations. However, they considered time-dependent coefficients without the local stationarity condition.
3.1 Existence and Uniqueness of a Solution
Before studying the local stationarity conditions on the time-varying coefficients, we present a set of regularity conditions for the existence and uniqueness of a solution of the system, based on the concepts defined by [21, 22].
Definition 3.2.
The random series in (14) converges a.s. if and only if , and by applying the Proposition 13.3.1 in [17], it converges absolutely if and only if with . Similar arguments are applied to the MA representation in (15).
To continue, we omit the subscript from the above notation. Consider the homogeneous difference equation
(16) |
If for any , there exist linearly independent solutions such that
(17) |
is invertible for any [23]. Therefore, we can define
(18) |
the one-sided Green’s function matrix associated with the AR operator . It can be shown that it is unique and invariant under different solutions obtained from the homogeneous difference equation (16). Furthermore, the one-sided Green’s function associated with the AR operator is defined as the upper left-hand element of the matrix (18),
(19) |
which is also unique and invariant. Now, we are ready to establish the conditions for AR regularity and MA regularity.
Theorem 3.3.
Let be a sequence of stochastic processes that satisfies (13). Suppose that for all , and , the one-sided Green’s function associated with , is such that , for all . Assume also that for all , and for and have no common roots. Then, there is a valid solution, given by
(20) |
to (13) with coefficients uniquely determined by
(21) |
Proof.
By setting , along with the absolute convergence conditions above, the proof is similar to [22]. ∎
Theorem 3.4.
Let be a sequence of stochastic processes that satisfies (13). Suppose that for all , and , the one-sided Green’s function associated with , is such that , for all . Assume also that for all , and and for have no common roots. Then, the process (13) is invertible and its explicit inversion is given by
(22) |
where denotes an arbitrary solution and the coefficients are uniquely determined by
(23) |
Theorem 3.5.
Proof.
The explicit form of the solution is straightforward, since a linear combination of stable random variables is also stable. Moreover, Property 1.2.6 of [14] implies that, for each , the solution is strictly stable, since each term has location parameter equal to zero. ∎
3.2 Local Stationarity
Similarly to Proposition 2.4 in [13], we can present the corresponding version for stable innovations. Since the process is not a second-order process, the time-varying spectral density does not exist.
Theorem 3.6.
Proof.
We give the proof for the tvAR(p) process (i.e. $q = 0$); the extension to tvARMA(p,q) is then straightforward. Since the process (12) is AR regular, there exists a solution of the form
that is well defined and the coefficients are given by
with
[for more detail, see 23]. Then, the proof of the existence of functions satisfying (3), (4) and (5) follows the same lines as in the finite-variance innovation case (see Appendix in [13]). ∎
Remark 1.
1. Note that since , can be approximated by
(24) which converges a.s. if and only if
Moreover, , with
2. The solution in (14) can be expressed as a linear combination of α-stable random variables and is strictly stable with the same index of stability α.
3. Observe that is not strictly stationary, but it can be approximated by , which is locally (strictly) stationary and strictly stable with the same index of stability.
4. Weak stationarity does not make sense, since the second moment does not exist. Consequently, the time-varying spectral representation does not exist.
5. Let be the sequence of solutions defined in (14) and be the sequence of stochastic processes defined in (24). Both processes are strictly stable, since all linear combinations are strictly stable with the same index of stability. This means that weak stationarity is lost, but it is replaced by the same tail behavior throughout time. This is the reason we call this process a stable locally (strictly) stationary process.
If we consider symmetric stable (SαS) innovations, i.e. β = 0, the simplest form is obtained.
Corollary 3.7 (tvARMA with symmetric stable innovations).
In the symmetric case, in addition to the properties in Remark 1, note that the processes and are symmetric stable.
3.3 Some examples
Example 3.8.
The tvMA(q) model with stable innovations:
(25) |
Example 3.9.
Consider the tvAR(p) model with stable innovations
(26) |
Under regularity conditions, the process does not have a solution of the form
but only of the form (14) with
(27) |
and
where for [13]. Moreover, can be approximated by which satisfies (3), (4) and (5).
Figure 1 presents simulated tvAR(1) processes with different innovation distributions (Gaussian, and symmetric α-stable innovations) and a linear coefficient. We observe that, for smaller α, the process seems to have more outliers.
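A path like those in Figure 1 can be generated as follows. This is a sketch with illustrative parameter values; `a0` and `a1` are our names for the linear coefficient $a(u) = a_0 + a_1 u$, the innovations use the symmetric Chambers-Mallows-Stuck recipe of Section 2.2, and the scale $1/\sqrt{2}$ makes the $\alpha = 2$ case standard Gaussian, as noted after Definition 3.1:

```python
import numpy as np

def sim_tvar1(T, a0, a1, alpha, rng=None):
    """tvAR(1): X_t = a(t/T) X_{t-1} + eps_t with a(u) = a0 + a1*u and
    eps_t symmetric alpha-stable with scale 1/sqrt(2) (CMS method)."""
    rng = np.random.default_rng(rng)
    U = rng.uniform(-np.pi / 2, np.pi / 2, T)
    W = rng.exponential(1.0, T)
    Z = (np.sin(alpha * U) / np.cos(U) ** (1 / alpha)
         * (np.cos((1 - alpha) * U) / W) ** ((1 - alpha) / alpha))
    eps = Z / np.sqrt(2)            # scale 1/sqrt(2): alpha = 2 gives N(0, 1)
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = (a0 + a1 * t / T) * x[t - 1] + eps[t]
    return x
```

Plotting `sim_tvar1(1000, -0.3, 0.8, alpha)` for α equal to 2 and to values below 2 reproduces the qualitative picture: the heavier-tailed paths show occasional large spikes.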
3.4 Prediction
Recall that the stable tvARMA process has infinite variance; prediction results for stable ARMA processes with time-dependent coefficients are presented in [21, 22]. It is then possible to predict future values with the approach applied by [24], which considers the observed values and rescales the time interval to , where is the forecasting horizon and the ratio tends to zero as tends to infinity. Here, we consider that the innovations are symmetric α-stable random variables.
Suppose that we have the system of difference equations (12) satisfying the above regularity conditions, and with observed. We are interested in predictions with horizon , i.e. .
Since is AR regular, it can be expressed as
(28) |
Let be the best linear predictor of for , namely
(29) |
where are some functions. Since the prediction error is also an α-stable random variable, it is possible to define its dispersion as , with its scale parameter. The idea is to minimize the dispersion . Note that
(30) |
Then, assuming and using properties of α-stable random variables, its dispersion is
(31) |
Minimizing the expression (31), we obtain the following theorem.
Theorem 3.10.
The minimum dispersion predictor is given by
(32) |
Proof.
4 Indirect inference for stable tvARMA processes
The IM is the tvARMA with α-stable innovations. We first study parameter estimation when the innovation parameters α and β are known. Then, we study the case where α is unknown.
Suppose that the parameter curves of the model of interest can be parametrized by a finite-dimensional parameter θ. The estimation strategy is to consider an auxiliary model with the same parametric time-varying coefficient structure but with Student's t innovations. The conditional likelihood estimate, as defined in [3], is
(33) |
where . In practice, we use a t distribution with fixed degrees of freedom for the case of known parameters, since its tail is heavier than the Gaussian one. For the unknown-α case, we let the degrees of freedom be estimated in the AM.
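The auxiliary fit in (33) can be sketched as follows for a tvAR(1) with linear coefficient. This is our illustrative implementation, not the paper's code: the degrees of freedom ν are held fixed (the known-α variant) and a scale parameter s is included so the t density can match the data:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import t as student_t

def fit_aux_tvar1(x, nu=5.0):
    """Conditional MLE of (b0, b1, s) in the auxiliary model
    x_t = (b0 + b1 * t/T) * x_{t-1} + s * eta_t,  eta_t ~ t_nu (nu fixed)."""
    T = len(x)
    u = np.arange(1, T) / T                        # rescaled time t/T
    def negloglik(par):
        b0, b1, log_s = par
        resid = x[1:] - (b0 + b1 * u) * x[:-1]     # conditional residuals
        return -np.sum(student_t.logpdf(resid, df=nu, scale=np.exp(log_s)))
    res = minimize(negloglik, x0=np.zeros(3), method="Nelder-Mead")
    b0, b1, log_s = res.x
    return b0, b1, np.exp(log_s)
```

In the indirect inference loop, this fit would be applied both to the observed series (Step 1) and to each series simulated from the stable IM (Step 3).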
5 Simulation study
This section presents a Monte Carlo (MC) simulation in order to investigate the properties of the indirect inference estimators. All simulation programs and routines were implemented in the R language. We present one scenario for each of the following models, with different values of α. Other scenarios were also performed for each case, with similar results; they are available upon request from the authors. For each scenario, simulations were done for , and observations based on independent replications. The indirect inference was carried out using . For known α, we also performed the blocked Whittle estimation (BWE), proposed by [2], to compare the estimation of the time structure of the model with indirect inference. The block size and block shift suggested by [25] are used.
5.1 Known α case
5.1.1 α-stable tvAR(1)
We illustrate how indirect inference can be employed for the tvAR(1) with a linear parametric form of the time-varying coefficient , and we consider known. Therefore, the parameter vector of the IM is .
The simulation was performed by assuming known parameters and and unknown . It is important to report that, since is close to 2, all replications for the BWE converged. This outcome is expected, since the innovation distribution approaches the Gaussian distribution for α close to 2.
Table 1 reports the MC mean and standard error for both estimation methods. Notice that the indirect estimates seem to be consistent, that is, their MC means approach the true parameters and their standard errors decrease as T increases. On the other hand, the MC means of the BWE differ from the true parameters and present higher standard errors compared to our estimation approach.
Table 2 presents the kurtosis and skewness of all estimates from both methods. In general, the indirect estimates present lower kurtosis and skewness close to 0. Notice that, since the second moment of the process does not exist, the parameter estimates from the BWE present highly positive asymmetry and they underestimate the true parameter.
Indirect estimates | BWE | |||||
---|---|---|---|---|---|---|
-0.2952 | 0.7897 | 0.9966 | -0.2880 | 0.7825 | 1.2086 | |
(0.0881) | (0.1523) | (0.0366) | (0.1172) | (0.2216) | (0.6352) | |
-0.2975 | 0.7926 | 0.9996 | -0.2917 | 0.7845 | 1.2197 | |
(0.0585) | (0.1028) | (0.0260) | (0.0811) | (0.1545) | (0.4734) | |
-0.2974 | 0.7958 | 0.9997 | -0.2940 | 0.7926 | 1.2709 | |
(0.0494) | (0.0793) | (0.0209) | (0.0639) | (0.1162) | (0.8738) |
Indirect estimates | BWE | ||||||
---|---|---|---|---|---|---|---|
Kur | 3.0330 | 2.8565 | 3.1076 | 3.3783 | 3.0944 | 375.8563 | |
Skw | 0.1388 | -0.1354 | 0.1241 | -0.0129 | -0.0875 | 16.6778 | |
Kur | 3.1678 | 3.2835 | 2.7390 | 2.8722 | 2.9543 | 95.2261 | |
Skw | 0.0341 | -0.0260 | 0.0437 | 0.0057 | -0.1047 | 8.4487 | |
Kur | 3.1024 | 3.0645 | 2.9329 | 3.7487 | 6.1799 | 187.9560 | |
Skw | -0.0299 | 0.0248 | 0.0026 | 0.1707 | -0.5935 | 12.4661 |
Figure 2 shows the density estimates of each parameter. They show that the standard errors become smaller as T increases. Together with the results from Tables 1 and 2, we conclude that the indirect estimates behave better than the BWE in terms of mean, standard error, skewness and kurtosis. Therefore, the simulation results show that the indirect inference performs well.
5.1.2 α-stable tvMA(1)
In this section, we carried out simulations for a tvMA(q) in (25) with and :
(35) |
where with and known.
The indirect inference is employed for the linear parametric form of the time varying coefficient , and we consider that for known and . Hence, the vector of parameters of the model of interest is .
This scenario assumes known and and unknown . For the BWE case, only the replications with converged estimates are considered for , and , respectively. This result is expected because the BWE assumes a finite second moment. The MC mean, standard error, kurtosis and skewness of the estimates are reported in Tables 3 and 4, and the density estimates in Figure 3.
Similarly to the previous case, the indirect estimates seem to be consistent, and the standard errors become smaller as T increases. In this case, since α is smaller, the distribution of the indirect estimates has heavier tails, and they have kurtosis and skewness similar to the BWE estimates, except for the parameter , for which the indirect estimation behaves better. In addition, in terms of standard error and MC mean, they still behave better than the BWE. We conclude that the indirect inference has a good performance.
Indirect estimates | BWE (in the tvMA(1) simulations the BWE did not converge in some cases; excluding those cases, replications are included for , and , respectively) |
---|---|---|---|---|---|---|
0.3561 | -0.5888 | 1.1989 | 0.3424 | -0.5427 | 18.7932 | |
(0.0298) | (0.0577) | (0.0600) | (0.1418) | (0.3084) | (38.4343) | |
0.3545 | -0.5953 | 1.1986 | 0.3386 | -0.5532 | 47.5752 | |
(0.0186) | (0.0352) | (0.0412) | (0.0870) | (0.1955) | (232.0620) | |
0.3536 | -0.5982 | 1.1986 | 0.3357 | -0.5555 | 49.4572 | |
(0.0131) | (0.0244) | (0.0331) | (0.0747) | (0.1690) | (178.3984) |
Indirect estimates | BWE (non-converged replications excluded, as in Table 3) | |
---|---|---|---|---|---|---|---|
Kur | 7.9023 | 6.3460 | 2.9050 | 6.9952 | 7.3434 | 233.1652 | |
Skw | 1.2950 | 0.0800 | 0.2117 | 0.1961 | 1.1869 | 13.5324 | |
Kur | 9.8633 | 10.4926 | 2.8616 | 11.7454 | 8.6755 | 385.9194 | |
Skw | 1.6510 | 0.7121 | 0.0841 | 0.8633 | 1.5096 | 17.6374 | |
Kur | 8.1466 | 20.6873 | 2.8140 | 9.7011 | 10.5014 | 156.1843 | |
Skw | 1.5194 | 1.4762 | 0.1493 | -0.1837 | 1.9926 | 11.4943 |
5.1.3 α-stable tvARMA(1,1)
The third simulation was carried out with the case of tvARMA(p,q) with , and :
(36) |
where with and known.
We suppose a linear parametric form for the time-varying coefficients and . Therefore, the parameter vector of the IM is .
The simulation was done by assuming , and . For BWE, and replications with converged estimates are included for , and , respectively.
The MC mean, standard error, kurtosis and skewness of the estimates from the tvARMA(1,1) simulation are reported in Tables 5 and 6, and the density estimates in Figure 4. In general, the distribution of the indirect estimates has heavier tails, and the kurtosis and skewness are similar to those of the BWE (except for the parameter , for which the indirect estimates behave better). However, in terms of standard error and MC mean, they behave much better than the BWE. Therefore, the indirect inference works well for the tvARMA(1,1).
Indirect estimates | BWE (in the tvARMA(1,1) simulations the BWE did not converge in some cases; excluding those cases, replications are included for , and , respectively) |
---|---|---|---|---|---|---|---|---|---|---|
-0.4000 | 0.1061 | 0.0987 | 0.3097 | 0.9976 | -0.3917 | 0.1021 | 0.1078 | 0.3049 | 1.4151 | |
(0.1360) | (0.2222) | (0.1501) | (0.2395) | (0.0386) | (0.1952) | (0.3522) | (0.2130) | (0.3810) | (0.7603) | |
-0.3921 | 0.0881 | 0.1064 | 0.2905 | 0.9982 | -0.3850 | 0.0815 | 0.1105 | 0.2880 | 1.4919 | |
(0.1001) | (0.1617) | (0.1053) | (0.1652) | (0.0290) | (0.1409) | (0.2535) | (0.1470) | (0.2599) | (0.5806) | |
-0.3992 | 0.1021 | 0.0988 | 0.3060 | 0.9982 | -0.3939 | 0.0926 | 0.1055 | 0.2964 | 1.5538 | |
(0.0754) | (0.1269) | (0.0793) | (0.1285) | (0.0232) | (0.1085) | (0.1955) | (0.1155) | (0.2040) | (0.8009) |
Indirect estimates | BWE (non-converged replications excluded, as in Table 5) | |
---|---|---|---|---|---|---|---|---|---|---|---|
Kur | 3.3650 | 3.1791 | 3.3657 | 3.4297 | 2.9935 | 2.9112 | 3.0692 | 3.2168 | 3.2822 | 203.2823 | |
Skw | 0.2754 | -0.2426 | -0.0746 | -0.1274 | 0.1839 | 0.1699 | -0.1329 | -0.2024 | -0.0422 | 12.0737 | |
Kur | 3.3964 | 3.5054 | 3.4470 | 3.4690 | 3.0327 | 3.4242 | 3.2166 | 3.2601 | 3.0064 | 41.6803 | |
Skw | 0.2002 | -0.1558 | 0.0100 | -0.1184 | 0.2460 | 0.3149 | -0.2253 | 0.0267 | -0.1713 | 4.9608 | |
Kur | 3.5817 | 3.1935 | 3.7052 | 3.3930 | 2.9790 | 2.9176 | 2.8801 | 3.3685 | 3.3091 | 96.2273 | |
Skw | 0.2895 | -0.1097 | 0.0137 | -0.0730 | 0.0718 | 0.0809 | 0.0268 | -0.1083 | 0.0372 | 7.7396 |
5.2 Unknown α case
5.2.1 α-stable tvAR(1)
Consider the tvAR(1) model
(37) |
where with known . Here, the indirect inference is employed for the tvAR(1) in (34) with the linear parametric form of the time-varying coefficient , and . The parameter vector of the IM is . For the AM, the same parametric form with the t distribution and unknown degrees of freedom is used, that is, .
The simulation was performed by assuming . Table 7 reports the MC mean and standard error of the estimates. Notice that the MC means of the indirect estimates seem to be consistent. Table 8 presents the kurtosis and skewness of the indirect estimates, which are close to 3 and 0, respectively. Indeed, they are similar to the case where α is known.
T | Indirect estimates | |||||||||
---|---|---|---|---|---|---|---|---|---|---|
Model of Interest | Auxiliary model | |||||||||
0.3482 | -0.5980 | 1.4083 | 0.4922 | 0.1111 | 0.3482 | -0.5980 | 1.8853 | 0.3994 | 0.0897 | |
(0.0406) | (0.0715) | (0.0737) | (0.0527) | (0.0960) | (0.0407) | (0.0716) | (0.2351) | (0.0446) | (0.0778) | |
0.3492 | -0.5986 | 1.4037 | 0.4974 | 0.1033 | 0.3492 | -0.5986 | 1.8622 | 0.4033 | 0.0834 | |
(0.0244) | (0.0430) | (0.0520) | (0.0370) | (0.0661) | (0.0244) | (0.0429) | (0.1570) | (0.0311) | (0.0533) | |
0.3498 | -0.5988 | 1.4000 | 0.4976 | 0.1011 | 0.3499 | -0.5988 | 1.8478 | 0.4030 | 0.0818 | |
(0.0187) | (0.0323) | (0.0417) | (0.0305) | (0.0546) | (0.0187) | (0.0323) | (0.1244) | (0.0255) | (0.0441) |
T | Indirect estimates | |||||
---|---|---|---|---|---|---|
kur | 3.7767 | 3.6800 | 3.1078 | 2.8924 | 3.0343 | |
skw | -0.1369 | 0.1213 | 0.2730 | 0.1583 | 0.0156 | |
kur | 4.7209 | 3.8513 | 2.7680 | 3.1397 | 3.0710 | |
skw | -0.1548 | 0.1008 | 0.0889 | 0.0672 | -0.0654 | |
kur | 4.2029 | 3.8664 | 2.7385 | 3.0881 | 3.0266 | |
skw | 0.1274 | -0.0192 | 0.0967 | 0.0973 | 0.0108 |
Finally, Figure 5 shows the density estimates of each parameter. The density estimates show that the standard errors become smaller as T increases. We conclude that the indirect estimates seem to be consistent for these sample path lengths.
5.2.2 α-stable tvMA(1)
We now illustrate indirect inference for the model (35) with unknown . The parameter vector of the IM is and that of the AM is . The simulation was performed by assuming .
The MC mean and standard error of the estimates from both models (IM and AM) are reported in Table 9, and the kurtosis and skewness are presented in Table 10. Together with the density estimates shown in Figure 6, the indirect estimates seem to be consistent for these sample path lengths. One interesting result is that, while the IM has infinite variance, the AM was estimated with finite variance.
T | Indirect estimates | |||||||
---|---|---|---|---|---|---|---|---|
Model of Interest | Auxiliary model | |||||||
-0.3518 | 0.4016 | 1.7566 | 0.7008 | -0.3518 | 0.4016 | 3.9795 | 0.3810 | |
(0.0699) | (0.1245) | (0.0739) | (0.0296) | (0.0694) | (0.1237) | (1.0183) | (0.0390) | |
-0.3487 | 0.3987 | 1.7527 | 0.6999 | -0.3486 | 0.3987 | 3.8307 | 0.3776 | |
(0.0446) | (0.0787) | (0.0559) | (0.0229) | (0.0445) | (0.0788) | (0.6414) | (0.0299) | |
-0.3504 | 0.4009 | 1.7525 | 0.7003 | -0.3502 | 0.4007 | 3.7874 | 0.3785 | |
(0.0375) | (0.0663) | (0.0457) | (0.0187) | (0.0373) | (0.0661) | (0.4852) | (0.0242) |
T | Indirect estimates | ||||
---|---|---|---|---|---|
kur | 3.8667 | 3.2718 | 2.8731 | 3.3445 | |
skw | -0.0413 | -0.0030 | -0.1140 | 0.0003 | |
kur | 3.7260 | 3.4565 | 2.8049 | 2.9412 | |
skw | 0.0436 | 0.0582 | -0.0081 | 0.1446 | |
kur | 3.6876 | 3.4211 | 3.0489 | 3.0002 | |
skw | -0.0043 | 0.0187 | -0.2133 | 0.0557 |
5.2.3 α-stable tvARMA(1,1)
Finally, the simulation was done for the tvARMA(1,1) in (36), but is assumed to be unknown. The time-varying coefficients are assumed to be linear, i.e. and , with known . Therefore, the parameter vector of the IM is , while the AM has parameter . The simulation was performed by assuming .
The MC mean and standard error of the estimates from both the IM and the AM are reported in Table 11, and the kurtosis and skewness are presented in Table 12. The density estimates are shown in Figure 7. Again, the indirect estimates seem to be consistent. Moreover, compared with the simulation results for known , they present similar standard error, kurtosis and asymmetry.
T | |||||||
---|---|---|---|---|---|---|---|
Model of Interest | -0.2036 | -0.3932 | 0.1971 | 0.3064 | 1.3018 | 1.0923 | |
(0.0585) | (0.0869) | (0.0587) | (0.0891) | (0.0698) | (0.0587) | ||
-0.2005 | -0.3986 | 0.2003 | 0.3004 | 1.3045 | 1.0976 | ||
(0.0319) | (0.0489) | (0.0329) | (0.0504) | (0.0471) | (0.0433) | ||
-0.1998 | -0.3999 | 0.2012 | 0.2983 | 1.2998 | 1.0953 | ||
(0.0233) | (0.0359) | (0.0250) | (0.0374) | (0.0390) | (0.0347) | ||
T | |||||||
Auxiliary model | -0.2036 | -0.3936 | 0.1971 | 0.3062 | 1.5904 | 0.7465 | |
(0.0584) | (0.0864) | (0.0586) | (0.0889) | (0.1731) | ( 0.0940) | ||
-0.2006 | -0.3986 | 0.2003 | 0.3006 | 1.5917 | 0.7542 | ||
(0.0319) | (0.0483) | (0.0329) | (0.0505) | (0.1160) | (0.0668) | ||
-0.1998 | -0.4009 | 0.2012 | 0.2983 | 1.5772 | 0.7487 | ||
(0.0232) | (0.0344) | (0.0250) | (0.0374) | (0.0947) | (0.0540) |
T | Indirect estimates | ||||||
---|---|---|---|---|---|---|---|
kur | 5.2893 | 4.5345 | 6.0807 | 5.5860 | 3.0705 | 3.3815 | |
skw | 0.2593 | -0.1945 | -0.1951 | 0.1297 | 0.1406 | 0.2709 | |
kur | 4.6288 | 4.1073 | 4.5984 | 4.1306 | 3.4796 | 2.9077 | |
skw | -0.1406 | 0.1144 | 0.0360 | -0.0328 | 0.1167 | -0.0713 | |
kur | 4.9301 | 4.0790 | 5.1471 | 4.5091 | 3.1301 | 3.0701 | |
skw | 0.0236 | -0.1964 | 0.0964 | -0.2378 | 0.1004 | 0.0833 |
6 Application
In this section, we illustrate an application to wind power generated in German offshore wind farms from 16/06/2015 at 00:00 to 27/07/2015 at 24:00 ( hours), obtained from the EMHIRES (European Meteorological High resolution RES time series) datasets [26]. For daily data, the Gaussian innovation assumption seems appropriate, but the hourly time series presents heavy tails and the Gaussian assumption is inadequate. Figure 8, panel (a), shows the original time series and its difference, while panel (b) shows the standardized histogram of the differenced time series, which exhibits heavy-tailed behavior. We selected only a small segment of the data because the whole time series has a more complex structure, such as seasonality, for which a non-parametric approach could be more appropriate.
Figure 9 shows the sample (global) autocorrelation function and the partial autocorrelation function. Traditional models, like the ARMA(1,1) and AR(4), seem appropriate, but the blocked smoothed periodogram shows a slowly changing structure over time.
To explore its local structure, we estimated the ARMA(1,1) and AR(4) models over 9 time blocks. Figures 10 and 11 present the smoothed estimated coefficients over time for both models. Both cases show that the coefficients are approximately linear over time. Consequently, two models are proposed:
• tvARMA(1,1) model with linear coefficients, , and .
• tvAR(4) model with linear coefficients, , , , and .
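The block-by-block exploratory fit described above can be sketched as follows. This is our own helper, using plain least squares per block rather than the blocked Whittle estimation used in the paper:

```python
import numpy as np

def blockwise_ar_coeffs(x, order=4, n_blocks=9):
    """Fit an AR(order) by least squares on each of n_blocks segments;
    return the block midpoints u in (0,1) and the coefficient estimates."""
    T = len(x)
    edges = np.linspace(0, T, n_blocks + 1, dtype=int)
    mids, coeffs = [], []
    for a, b in zip(edges[:-1], edges[1:]):
        seg = x[a:b]
        # regress seg[t] on its first `order` lags within the block
        Y = seg[order:]
        X = np.column_stack([seg[order - k:-k] for k in range(1, order + 1)])
        phi, *_ = np.linalg.lstsq(X, Y, rcond=None)
        mids.append((a + b) / (2 * T))
        coeffs.append(phi)
    return np.array(mids), np.array(coeffs)
```

A linear trend in each coefficient, as assumed in the two proposed models, can then be read off with `np.polyfit(mids, coeffs[:, j], 1)`.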
After estimating both models, the residuals of the tvARMA(1,1) are correlated, so we focus only on the tvAR(4). The parameter estimates are reported in Table 13. Figure 12 presents the residual analysis: the QQ-plot, box-plot and histogram show that the error distribution has heavy tails and that the residuals are approximately white noise. Additionally, we estimated the skewness () and kurtosis () and carried out the Shapiro-Wilk and Jarque-Bera tests, which rejected the null hypothesis of normality. Moreover, Figure 13 presents the variogram of the first difference of the wind data and of the residuals from the tvAR(4) model. Both variograms clearly diverge.
Parameter | Estimate (BWE) | s.e. | z-value | p-value
---|---|---|---|---
 | -1.5985 | 0.0768 | -20.8171 | 0.0000
 | 0.3305 | 0.1406 | 2.3508 | 0.0187
 | 0.9135 | 0.1373 | 6.6536 | 0.0000
 | 0.0207 | 0.2447 | 0.0847 | 0.9325
 | -0.0585 | 0.1372 | -0.4266 | 0.6697
 | -0.7153 | 0.2445 | -2.9254 | 0.0034
 | -0.1316 | 0.0767 | -1.7158 | 0.0862
 | 0.5454 | 0.1405 | 3.8831 | 0.0001
 | 0.0077 | 0.0003 | 24.5452 | 0.0000
 | 0.0152 | 0.0007 | 21.6400 | 0.0000






Since the residuals present heavy tails, we propose a more flexible model, the stable tvAR(4). We performed indirect estimation assuming both known and unknown . In the first case, we assume and , obtained by MLE from the residuals of the BWE. Since the estimation results are similar to those of the model with unknown , we present only the results of the second model here.
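The indirect-inference step requires simulating symmetric α-stable innovations for the model. One standard construction, sketched here under the assumption that it matches the simulator used in the paper, is the Chambers-Mallows-Stuck method described by [15]; the value α = 1.39 below is merely chosen close to the indirect estimate reported in Table 14.

```python
import numpy as np

def sym_stable(alpha, size, rng):
    """Symmetric alpha-stable variates (beta = 0, unit scale) via the
    Chambers-Mallows-Stuck construction, valid for alpha != 1."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos(V - alpha * V) / W) ** ((1 - alpha) / alpha))

# alpha chosen near the indirect estimate in Table 14 (illustrative).
rng = np.random.default_rng(2)
eps = sym_stable(1.39, 10_000, rng)
```

For α = 2 the construction reduces to (scaled) Gaussian draws, which is the sense in which the Gaussian model is nested in the stable one.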
The vector of parameters of the IM is , and the indirect inference was carried out assuming symmetric α-stable innovations. Table 14 reports the indirect estimates, assuming , together with their Monte Carlo standard errors based on replications.
Parameter | Indirect estimate | Standard error
---|---|---
 | -1.5434 | 0.0251
 | -0.0316 | 0.0426
 | 0.9036 | 0.0442
 | 0.1083 | 0.0764
 | -0.2818 | 0.0437
 | -0.2235 | 0.0752
 | 0.0639 | 0.0246
 | 0.1496 | 0.0412
 | 1.3875 | 0.0528
 | 0.0065 | 0.0005
 | 0.0033 | 0.0010
To compare the residual distribution with the stable distribution, [27] suggested using the stabilized probability plot (stabilized p-p plot), proposed by [28], instead of the QQ-plot, because the latter is not suitable for evaluating heavy-tailed distributions: in a QQ-plot, the large fluctuations of the extreme observations of a heavy-tailed distribution produce large standard errors in the tails. Let be an ordered random sample of size from the distribution . The stabilized p-p plot is defined as the plot of versus . The histogram and the stabilized p-p plot in Figure 14 show that the stable distribution fits the residuals well.
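In the construction of [28], both coordinates apply the arcsine transform (2/π)·arcsin(√p), which makes the pointwise variability roughly constant along the plot. A minimal sketch of the plot coordinates follows; it is illustrated with a logistic sample against its true CDF purely because that CDF has a closed form, whereas for the wind residuals the fitted stable CDF would take the place of `logistic_cdf`.

```python
import numpy as np

def stabilized_pp(x, cdf):
    """Coordinates of the stabilized p-p plot: plot s against r.
    Both axes use the arcsine transform (2/pi) * arcsin(sqrt(p))."""
    x = np.sort(np.asarray(x))
    n = len(x)
    r = (2 / np.pi) * np.arcsin(np.sqrt((np.arange(1, n + 1) - 0.5) / n))
    s = (2 / np.pi) * np.arcsin(np.sqrt(cdf(x)))
    return r, s

def logistic_cdf(x):
    # Closed-form CDF used only for this illustration.
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(3)
r, s = stabilized_pp(rng.logistic(size=500), logistic_cdf)
```

A good fit appears as points lying close to the 45-degree line, with comparable scatter in the tails and the center.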


Finally, we compare the mean square error (MSE), root mean square error (RMSE) and mean absolute error (MAE) of the tvARMA(1,1) and tvAR(4) models using the BWE and the indirect estimates. Note that the MSE and RMSE do not make sense theoretically if we assume a stable tvAR(4). In Table 15, we observe that the BWE (which assumes finite variance) yields slightly lower MSE and RMSE, while the indirect inference yields a lower MAE.
Since the residual analysis indicates heavy tails, the stable tvAR(4) is a better model to describe the data. In this case, the estimated , which is far from , and the simulations in the previous section show that the BWE is not appropriate. Even though the MSE and RMSE are lower for the BWE, they are not appropriate for a stable process since they cannot be handled theoretically. Based on the MAE, the indirect inference performs slightly better. Moreover, the interpretation of the estimated coefficients also changes: and are constant, while , and vary linearly.
Model | MSE | RMSE | MAE
---|---|---|---
tvARMA(1,1) | 0.000248 | 0.015739 | 0.009675
α-stable tvARMA(1,1) | 0.000257 | 0.016028 | 0.009469
tvAR(4) | 0.000242 | 0.015542 | 0.009468
α-stable tvAR(4) | 0.000256 | 0.015993 | 0.009094
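For reference, the three accuracy measures in Table 15 can be computed as follows; `accuracy` is a hypothetical helper written for this sketch, not code from the paper.

```python
import numpy as np

def accuracy(y, y_hat):
    """MSE, RMSE and MAE of one-step errors, as in Table 15.
    Under stable innovations only the MAE is theoretically justified."""
    e = np.asarray(y, dtype=float) - np.asarray(y_hat, dtype=float)
    mse = np.mean(e ** 2)
    return {"MSE": mse, "RMSE": np.sqrt(mse), "MAE": np.mean(np.abs(e))}

m = accuracy([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
```

With infinite-variance innovations the population MSE does not exist, which is why the comparison in the text leans on the MAE.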
7 Conclusion
In this paper, we studied stable locally stationary ARMA processes and presented their properties. In contrast to locally stationary processes with finite variance, this process accommodates the infinite variance observed in different fields. We also proposed an indirect inference method for processes with parametric time-varying coefficients. We performed simulations for basic models with linear parametric coefficients, for both known and unknown . The results show that indirect inference is appropriate. An application is also illustrated.
There are some limitations that still need to be addressed in future work. Firstly, since the time-varying spectral representation does not exist, identifying the local structure using traditional tools (autocorrelation and partial autocorrelation) is only an informal way to identify the time-varying structure. One possibility is a local version of the dependence measure called autocovariation [18]. Secondly, simulations should be carried out for more complex models, also considering non-parametric specifications. Thirdly, indirect inference is time-consuming, but it is appropriate when heavy-tailed innovations are present. The simulations suggest that when is close to , Gaussian innovations can be assumed. Model selection is still an open question, and there is little work related to prediction.
Finally, we are currently researching locally stationary processes with tempered stable innovations, which resemble the stable distribution in the center but have lighter tails and finite moments of all orders.
Funding
The authors are grateful to the support of a CNPq grant (141607/2017-3) and the University of Costa Rica (SWC), and a Fapesp grant 2018/04654-9 (PAM).
References
- [1] Dahlhaus R. Maximum likelihood estimation and model selection for locally stationary processes. Journal of Nonparametric Statistics. 1996;6(2-3):171–191.
- [2] Dahlhaus R. Fitting time series models to nonstationary processes. The Annals of Statistics. 1997;25(1):1–37.
- [3] Dahlhaus R. Locally stationary processes. In: Subba Rao T, Subba Rao S, Rao CR, editors. Time series analysis: Methods and applications. (Handbook of Statistics; Vol. 30). Elsevier; 2012. p. 351–413.
- [4] McCulloch JH. Simple consistent estimators of stable distribution parameters. Communications in Statistics - Simulation and Computation. 1986;15(4):1109–1136.
- [5] Koutrouvelis IA. An iterative procedure for the estimation of the parameters of stable laws. Communications in Statistics - Simulation and Computation. 1981;10(1):17–28.
- [6] Gourieroux C, Monfort A, Renault E. Indirect inference. Journal of Applied Econometrics. 1993;8(S1):S85–S118.
- [7] Gallant AR, Tauchen G. Which moments to match? Econometric Theory. 1996;12(4):657–681.
- [8] Lombardi MJ, Calzolari G. Indirect estimation of α-stable distributions and processes. Econometrics Journal. 2008;11(1):193–208.
- [9] Sampaio JM, Morettin PA. Indirect estimation of randomized generalized autoregressive conditional heteroskedastic models. Journal of Statistical Computation and Simulation. 2015;85(13):2702–2717.
- [10] Sampaio JM, Morettin PA. Stable randomized generalized autoregressive conditional heteroskedastic models. Econometrics and Statistics. 2019;(to appear).
- [11] Calzolari G, Halbleib R, Parrini A. Estimating GARCH-type models with symmetric stable innovations: Indirect inference versus maximum likelihood. Computational Statistics & Data Analysis. 2014;76:158–171.
- [12] Calzolari G, Halbleib R. Estimating stable latent factor models by indirect inference. Journal of Econometrics. 2018;205(1):280 – 301.
- [13] Dahlhaus R, Polonik W. Empirical spectral processes for locally stationary time series. Bernoulli. 2009;15(1):1–39.
- [14] Samorodnitsky G, Taqqu M. Stable non-gaussian random processes: Stochastic models with infinite variance. Taylor & Francis; 1994. Stochastic Modeling Series.
- [15] Weron A, Weron R. Computer simulation of Lévy α-stable variables and processes. Berlin, Heidelberg: Springer Berlin Heidelberg; 1995. p. 379–392.
- [16] Embrechts P, Klüppelberg C, Mikosch T. Modelling extremal events for insurance and finance. Springer-Verlag Berlin Heidelberg; 1997.
- [17] Brockwell PJ, Davis RA. Time series: Theory and methods. Springer-Verlag New York; 1991.
- [18] Kokoszka PS, Taqqu MS. Infinite variance stable ARMA processes. Journal of Time Series Analysis. 1994;15(2):203–220.
- [19] Kokoszka PS, Taqqu MS. Fractional ARIMA with stable innovations. Stochastic Processes and their Applications. 1995;60(1):19 – 47.
- [20] Mikosch T, Gadrich T, Kluppelberg C, et al. Parameter estimation for ARMA models with infinite variance innovations. The Annals of Statistics. 1995;23(1):305–326.
- [21] Shelton Peiris M, Thavaneswaran A. On the properties of some nonstationary ARMA processes with infinite variance. International Journal of Modelling and Simulation. 2001;21(4):301–304.
- [22] Shelton Peiris M, Thavaneswaran A. Multivariate stable ARMA processes with time dependent coefficients. Metrika. 2001 Nov;54(2):131–138.
- [23] Miller K. Linear difference equations. W. A. Benjamin; 1968. Mathematics monograph series.
- [24] Van Bellegem S, von Sachs R. Forecasting economic time series with unconditional time-varying variance. International Journal of Forecasting. 2004;20(4):611–627.
- [25] Dahlhaus R, Giraitis L. On the optimal segment length for parameter estimates for locally stationary time series. Journal of Time Series Analysis. 1998;19(6):629–655.
- [26] Gonzales-Aparicio I, Zucker A, Carerri F, et al. EMHIRES dataset. part I: Wind power generation European Meteorological derived high resolution res generation time series for present and future scenarios. EUR 28171 EN; 10.2790/831549; 2016.
- [27] Nolan JP. Maximum likelihood estimation and diagnostics for stable distributions; 2002.
- [28] Michael JR. The stabilized probability plot. Biometrika. 1983;70(1):11–17.