Asymptotic Results for Model Robust Regression
Since the mid-1980s, many statisticians have studied methods for combining parametric and nonparametric estimates to improve the quality of fits in a regression problem. Notably, in 1987 Einsporn and Birch proposed the Model Robust Regression estimate (MRR1), in which estimates of the parametric function, ƒ, and the nonparametric function, 𝑔, are combined in a straightforward fashion via a mixing parameter, λ. This technique was studied extensively in small samples and was shown to be quite effective at modeling various unusual functions. In 1995, Mays and Birch developed the MRR2 estimate as an alternative to MRR1. This model first forms the parametric fit to the data and then adds in an estimate of 𝑔 according to the lack of fit demonstrated by the residuals. Using small samples, they illustrated the superiority of MRR2 over MRR1 in most situations.

In this dissertation we develop asymptotic convergence rates for both MRR1 and MRR2 in OLS and GLS (maximum likelihood) settings. In many of these settings, it is demonstrated that the user of MRR1 or MRR2 achieves the best convergence rates available regardless of whether or not the model is properly specified. This is the "Golden Result of Model Robust Regression". It turns out that the selection of the mixing parameter is paramount in determining whether or not this result is attained.
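The two estimators described above can be sketched in notation as follows. This is a schematic rendering only; the symbols ŷ, f̂, ĝ, and the exact placement of λ are assumptions consistent with the description in the abstract, not a quotation of the dissertation's own equations:

```latex
% MRR1 (Einsporn and Birch, 1987): convex combination of the
% parametric fit \hat{f} and the nonparametric fit \hat{g},
% governed by a mixing parameter \lambda.
\hat{y}_{\mathrm{MRR1}} = (1-\lambda)\,\hat{f}(x) + \lambda\,\hat{g}(x),
  \qquad \lambda \in [0,1].

% MRR2 (Mays and Birch, 1995): start from the parametric fit and
% add a nonparametric estimate \hat{g}_r built from the residuals
% r_i = y_i - \hat{f}(x_i), scaled according to the lack of fit.
\hat{y}_{\mathrm{MRR2}} = \hat{f}(x) + \lambda\,\hat{g}_r(x),
  \qquad \lambda \in [0,1].
```

In both cases λ = 0 recovers the pure parametric fit, so the mixing parameter interpolates between trusting the specified model and correcting it nonparametrically, which is why its selection governs whether the "Golden Result" convergence rates are attained.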