dc.contributor.author | Politis, Dimitris Nicolas | en |
dc.contributor.author | Poulis, S. | en |
dc.contributor.editor | Politis, Dimitris Nicolas | en |
dc.contributor.editor | Akritas, Michael G. | en |
dc.contributor.editor | Lahiri, S.N. | en |
dc.creator | Politis, Dimitris Nicolas | en |
dc.creator | Poulis, S. | en |
dc.date.accessioned | 2019-12-02T10:37:54Z | |
dc.date.available | 2019-12-02T10:37:54Z | |
dc.date.issued | 2014 | |
dc.identifier.isbn | 978-1-4939-0568-3 | |
dc.identifier.uri | http://gnosis.library.ucy.ac.cy/handle/7/57528 | |
dc.description.abstract | In linear regression with heteroscedastic errors, the Generalized Least Squares (GLS) estimator is optimal, i.e., it is the Best Linear Unbiased Estimator (BLUE). The Ordinary Least Squares (OLS) estimator is suboptimal but still valid, i.e., unbiased and consistent. White, in his seminal paper (White, Econometrica 48:817–838, 1980) used the OLS residuals in order to obtain an estimate of the standard error of the OLS estimator under an unknown structure of the underlying heteroscedasticity. The GLS estimator similarly depends on the unknown heteroscedasticity, and is thus intractable. In this paper, we introduce two different approximations to the optimal GLS estimator; the starting point for both approaches is in the spirit of White’s correction, i.e., using the OLS residuals to get a rough estimate of the underlying heteroscedasticity. We show how the new estimators can benefit from the Wild Bootstrap both in terms of optimising them, and in terms of providing valid standard errors for them despite their complicated construction. The performance of the new estimators is compared via simulations to the OLS and to the exact (but intractable) GLS. © Springer Science+Business Media New York 2014. | en |
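The abstract outlines a two-step idea: use the OLS residuals as a rough estimate of the error variances, reweight the regression to approximate GLS, and use the Wild Bootstrap to obtain standard errors. A minimal illustrative sketch of that pipeline (not the authors' exact estimators; the function names, the squared-residual variance proxy, and the Rademacher bootstrap weights are all assumptions for illustration):

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares via the normal equations (X assumed full rank)."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def feasible_gls(X, y, eps=1e-8):
    """Two-step approximation to GLS: OLS residuals -> crude variance
    estimate -> weighted least squares with inverse-variance weights."""
    beta_ols = ols(X, y)
    resid = y - X @ beta_ols
    # Squared residuals as a rough pointwise variance proxy (floored at eps).
    w = 1.0 / np.maximum(resid**2, eps)
    sw = np.sqrt(w)
    # Weighted LS = OLS on rows rescaled by sqrt(weight).
    return ols(X * sw[:, None], y * sw)

def wild_bootstrap_se(X, y, B=200, rng=None):
    """Wild-bootstrap standard errors for the feasible GLS coefficients,
    using Rademacher (+/-1) multipliers on the residuals."""
    rng = np.random.default_rng(rng)
    beta = feasible_gls(X, y)
    fitted = X @ beta
    resid = y - fitted
    draws = np.empty((B, X.shape[1]))
    for b in range(B):
        signs = rng.choice([-1.0, 1.0], size=len(y))
        draws[b] = feasible_gls(X, fitted + signs * resid)
    return draws.std(axis=0, ddof=1)
```

The wild bootstrap resamples by flipping residual signs rather than resampling rows, which preserves each observation's own variance and is why it suits heteroscedastic errors.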
dc.publisher | Springer New York LLC | en |
dc.source | Springer Proceedings in Mathematics and Statistics | en |
dc.source | 1st Conference of the International Society of Nonparametric Statistics, ISNPS 2012 | en |
dc.source.uri | https://www.scopus.com/inward/record.uri?eid=2-s2.0-84919962012&doi=10.1007%2f978-1-4939-0569-0_26&partnerID=40&md5=e4cf3c4b52a51276d53ad4a31255dae4 | |
dc.subject | Errors | en |
dc.subject | Statistics | en |
dc.subject | Heteroscedasticity | en |
dc.subject | Best linear unbiased estimator | en |
dc.subject | BLUE | en |
dc.subject | Generalized least squares | en |
dc.subject | Heteroscedastic | en |
dc.subject | Least squares estimation | en |
dc.subject | Minimum variance | en |
dc.subject | Ordinary least squares | en |
dc.title | Heteroskedastic linear regression: Steps towards adaptivity, efficiency, and robustness | en |
dc.type | info:eu-repo/semantics/conferenceObject | |
dc.identifier.doi | 10.1007/978-1-4939-0569-0_26 | |
dc.description.volume | 74 | |
dc.description.startingpage | 283 | |
dc.description.endingpage | 297 | |
dc.author.faculty | Σχολή Θετικών και Εφαρμοσμένων Επιστημών / Faculty of Pure and Applied Sciences | |
dc.author.department | Τμήμα Μαθηματικών και Στατιστικής / Department of Mathematics and Statistics | |
dc.type.uhtype | Conference Object | en |
dc.description.notes | Sponsors: Springer Science and Business Media | en |
dc.description.notes | The Bernoulli Society for Mathematical Statistics and Probability | en |
dc.description.notes | The Institute of Mathematical Statistics (IMS) | en |
dc.description.notes | The International Statistical Institute (ISI) | en |
dc.description.notes | The Nonparametric Statistics Section of the American Statistical Association (ASA) | en |
dc.description.notes | Conference code: 111829 | en |