Author: Khoshgoftaar, Taghi M.
Date accessioned/available: 2019-03-26
Date issued: 1982
URI: http://hdl.handle.net/10919/88723

Abstract: In the case of repetitive experiments of a similar type, where the parameters vary randomly from experiment to experiment, the empirical Bayes method often leads to estimators with smaller mean squared errors than the classical estimators. Suppose there is an unobservable random variable θ with θ ~ G(θ), where G is usually called a prior distribution. The Bayes estimator of θ cannot be obtained in general unless G(θ) is known. In the empirical Bayes method we do not assume that G(θ) is known; instead, the sequence of past estimates is used to estimate θ. This dissertation develops empirical Bayes estimates of the parameters of various time series models: the autoregressive model, the moving average model, the mixed autoregressive-moving average model, regression with time series errors, regression with unobservable variables, serial correlation, multiple time series, and the spectral density function. In each case, empirical Bayes estimators are obtained using the asymptotic distributions of the usual estimators. By Monte Carlo simulation, the empirical Bayes estimator of the first-order autoregressive parameter, ρ, was shown to have smaller mean squared error than the conditional maximum likelihood estimator for 11 past experiences.

Extent: vi, 100, [1] leaves
Format: application/pdf
Language: en-US
Rights: In Copyright
Call number: LD5655.V856 1982.K538
Subjects: Time-series analysis; Bayesian statistical decision theory
Title: Empirical Bayes methods in time series analysis
Type: Dissertation
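The Monte Carlo comparison described in the abstract can be sketched as follows. This is a simplified illustration, not the dissertation's exact construction: it assumes a normal prior on ρ, uses the asymptotic sampling variance (1 − ρ²)/n of the conditional estimator, and shrinks each new estimate toward the mean of the 11 most recent past estimates with a normal-normal empirical Bayes rule. The prior parameters, sample size, and number of experiments are all hypothetical choices.

```python
# Sketch: empirical Bayes shrinkage for the AR(1) parameter rho,
# compared with the conditional (least-squares) ML estimator.
import random
import statistics

def simulate_ar1(rho, n, rng):
    """Generate n observations from x_t = rho * x_{t-1} + e_t, e_t ~ N(0, 1)."""
    x = [rng.gauss(0, 1)]
    for _ in range(n - 1):
        x.append(rho * x[-1] + rng.gauss(0, 1))
    return x

def conditional_mle(x):
    """Conditional ML / least-squares estimate of rho from one series."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(xi * xi for xi in x[:-1])
    return num / den

def eb_estimate(rho_hat, past_hats, n):
    """Shrink rho_hat toward the mean of past estimates (normal-normal EB rule)."""
    m = statistics.mean(past_hats)
    s2 = (1 - rho_hat ** 2) / n                    # asymptotic sampling variance
    tau2 = max(statistics.variance(past_hats) - s2, 0.0)  # estimated prior variance
    w = tau2 / (tau2 + s2)
    return w * rho_hat + (1 - w) * m

rng = random.Random(1982)
n, n_experiments = 50, 500
prior_mean, prior_sd = 0.5, 0.1                    # hypothetical prior G on rho

past_hats, mse_mle, mse_eb = [], 0.0, 0.0
for _ in range(n_experiments):
    # Draw this experiment's true rho from the prior, kept inside (-1, 1).
    rho = min(max(rng.gauss(prior_mean, prior_sd), -0.95), 0.95)
    rho_hat = conditional_mle(simulate_ar1(rho, n, rng))
    if len(past_hats) >= 11:                       # 11 past experiences, as in the abstract
        rho_eb = eb_estimate(rho_hat, past_hats[-11:], n)
        mse_mle += (rho_hat - rho) ** 2
        mse_eb += (rho_eb - rho) ** 2
    past_hats.append(rho_hat)

print(f"total squared error, conditional ML: {mse_mle:.4f}")
print(f"total squared error, empirical Bayes: {mse_eb:.4f}")
```

Because the true ρ really does vary across experiments here, the shrinkage estimator borrows strength from the past estimates and typically accumulates a smaller total squared error than the per-experiment conditional MLE, mirroring the abstract's finding.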