UMVU estimation of phase and group delay with small samples

Virginia Polytechnic Institute and State University

Group delay between two univariate time series is a measure, in units of time, of how one series leads or lags the other at specific frequencies. The only published method of estimating group delay is that of Hannan and Thomson (1973); however, their method is highly asymptotic and does not allow inference to be performed on the group delay parameter in finite samples. In fact, spectral analysis in general does not allow for small-sample inference, which is a difficulty with the frequency domain approach to time series analysis. No statistical inference may be performed in small samples because the distribution theory for spectral estimates is highly asymptotic, and in a particular application one can never be certain what finite sample size is large enough to justify the asymptotic result.

In this dissertation the asymptotic distribution theory is circumvented by applying the Box-Cox power transformation to the observed sample phase function. Once transformed, the sample phase is assumed to be approximately normally distributed, and the relationship between phase and frequency is modelled by a simple linear regression model. In order to estimate group delay it is necessary to inversely transform the predicted values to the original scale of measurement, and this is done by expanding the inverse Box-Cox transformation function in a Taylor series. The group delay estimates are generated by using the derivative of the Taylor series expansion for phase. The UMVUE property follows from the fact that the Taylor series expansions are functions of complete, sufficient statistics from the transformed domain, so the Lehmann-Scheffé (1950) theorem can be invoked to justify the UMVUE property.
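The pipeline described above, regress the Box-Cox-transformed sample phase on frequency, back-transform the fitted values, and differentiate with respect to frequency, can be sketched as follows. This is an illustrative simplification, not the dissertation's UMVU estimator: the Box-Cox parameter `lam`, the positivity shift, and the use of an unsmoothed cross-periodogram are assumptions made for the sketch, and the Taylor-series/UMVUE machinery is not reproduced.

```python
import numpy as np

def estimate_group_delay(x, y, fs=1.0, lam=0.5):
    """Illustrative group-delay estimate: regress the Box-Cox
    transformed cross-spectrum phase on frequency, back-transform
    the fitted line, and differentiate with respect to frequency.

    `lam` is a hypothetical Box-Cox parameter; the dissertation
    selects the transformation from the data, which is not done here.
    """
    n = len(x)
    X = np.fft.rfft(x - np.mean(x))
    Y = np.fft.rfft(y - np.mean(y))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)

    # Sample phase of the cross spectrum, unwrapped so that a pure
    # delay appears as a straight line in frequency.
    phase = np.unwrap(np.angle(np.conj(X) * Y))

    # Box-Cox requires positive arguments; the shift is purely an
    # illustrative device, not part of the published method.
    shift = 1.0 - phase.min()
    z = ((phase + shift) ** lam - 1.0) / lam

    # Simple linear regression of the transformed phase on frequency
    # (the zero-frequency ordinate is skipped).
    b, a = np.polyfit(freqs[1:], z[1:], 1)

    # Back-transform the fitted line, phase(f) = (lam*(a+b*f)+1)**(1/lam),
    # and differentiate by the chain rule.
    fitted = a + b * freqs[1:]
    dphase_df = b * (lam * fitted + 1.0) ** (1.0 / lam - 1.0)

    # Group delay in time units: tau(f) = -(1/(2*pi)) * d(phase)/df.
    return freqs[1:], -dphase_df / (2.0 * np.pi)
```

For a series that is a pure circular delay of another, e.g. `y = np.roll(x, 3)`, the cross-spectrum phase is exactly linear in frequency; with `lam = 1.0` the transformation reduces to the identity and the sketch recovers the delay of 3 samples.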