Self-normalization for heavy-tailed time series with long memory
Date: 2007
Author(s): McElroy, T.; Politis, Dimitris Nicolas
ISSN: 1017-0405
Source: Statistica Sinica
Volume: 17
Issue: 1
Pages: 199-220
Abstract
Many time series data sets have heavy tails and/or long memory, both of which are well known to greatly influence the rate of convergence of the sample mean. Typically, time series analysts consider models with either heavy tails or long memory; we consider both. The paper is essentially a theoretical case study that explores the growth rate of the sample mean for a particular heavy-tailed, long-memory time series model. An exact rate of convergence, which displays the competition between memory and tail thickness in fostering sample mean growth, is obtained in our main theorem. An appropriate self-normalization is used to produce a studentized sample mean statistic, computable without prior knowledge of the tail and memory parameters. This paper presents a novel heavy-tailed time series model that also has long memory in the sense of sums of well-defined autocovariances; we explicitly show the role that memory and tail thickness play in determining the sample mean's rate of growth, and we construct an appropriate studentization. Our model is a natural extension of long-memory Gaussian models to data with infinite variance, and therefore pertains to a wide range of applications, including finance, insurance, and hydrology.
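To illustrate the general idea of self-normalization mentioned above — a studentization whose normalizer is computed from the data themselves, so no tail or memory parameter needs to be known in advance — here is a minimal sketch of the classical self-normalized sum from the heavy-tailed literature. This is a generic illustration only; it is not the specific studentization constructed in the paper, and the Pareto-difference data generator is an arbitrary stand-in for a heavy-tailed sample.

```python
import numpy as np

def self_normalized_sum(x):
    """Classical self-normalized sum T_n = sum(x) / sqrt(sum(x**2)).

    The denominator is estimated from the same data as the numerator,
    so the statistic is computable without prior knowledge of the
    tail index or memory parameter. (Generic illustration, not the
    paper's exact studentization.)
    """
    x = np.asarray(x, dtype=float)
    return x.sum() / np.sqrt((x ** 2).sum())

# Hypothetical heavy-tailed sample: symmetrized Pareto draws, which have
# infinite variance when the tail exponent alpha is below 2.
rng = np.random.default_rng(0)
alpha = 1.5
x = rng.pareto(alpha, size=5000) - rng.pareto(alpha, size=5000)
t = self_normalized_sum(x)
```

Even though the raw sum of such a sample has a nonstandard growth rate driven by the tail exponent, the self-normalized statistic remains stochastically bounded, which is the practical appeal of studentizing in this way.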