Spectral density based goodness-of-fit tests for time series models
Date: 2000
Source: Scandinavian Journal of Statistics
Volume: 27
Issue: 1
Pages: 143-176

Abstract
A new goodness-of-fit test for time series models is proposed. The test statistic is based on the distance between a kernel estimator of the ratio of the true to the hypothesized spectral density and the expected value of that estimator under the null hypothesis. It quantifies how well a parametric spectral density model fits the sample spectral density (periodogram). The asymptotic distribution of the proposed statistic is derived and its power properties are discussed. To improve upon the large sample (Gaussian) approximation of the null distribution of the test statistic, a bootstrap procedure is presented and justified theoretically. The finite sample performance of the test is investigated through a simulation experiment, and applications to real data sets are given.
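The abstract's construction can be illustrated numerically. The sketch below is an assumption-laden simplification, not the paper's exact statistic: it computes the periodogram at the Fourier frequencies, divides by a hypothesized spectral density (here a white-noise model, chosen purely for illustration), smooths the ratio with a Gaussian kernel, and accumulates the squared deviation of the smoothed ratio from its null expectation of one. All function names, the kernel choice, and the bandwidth `h` are illustrative, not taken from the paper.

```python
import numpy as np

def periodogram(x):
    """Periodogram of x at the Fourier frequencies in (0, pi)."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, (n - 1) // 2 + 1) / n
    dft = np.fft.fft(x - x.mean())
    I = np.abs(dft[1:(n - 1) // 2 + 1]) ** 2 / (2 * np.pi * n)
    return lam, I

def white_noise_spec(lam, sigma2=1.0):
    # Spectral density of i.i.d. noise: constant sigma2 / (2*pi).
    # Illustrative null model; the paper treats general parametric models.
    return np.full_like(lam, sigma2 / (2 * np.pi))

def gof_statistic(x, f0, h=0.3):
    """L2-type distance between the kernel-smoothed ratio I/f0 and 1.

    A sketch of the abstract's idea: under the null, the smoothed
    periodogram-to-model ratio should be close to its expectation 1.
    """
    lam, I = periodogram(x)
    ratio = I / f0(lam)
    stat = 0.0
    for l0 in lam:
        w = np.exp(-0.5 * ((lam - l0) / h) ** 2)  # Gaussian kernel weights
        w /= w.sum()
        q = np.sum(w * ratio)                     # smoothed ratio at l0
        stat += (q - 1.0) ** 2
    return stat * np.pi / len(lam)                # Riemann-sum normalization

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
T = gof_statistic(x, white_noise_spec)
```

For correctly specified data, as here, `T` concentrates near zero; in the paper the null distribution is approximated asymptotically or via a bootstrap, neither of which this sketch attempts.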