Topics in multivariate covariance estimation and time series analysis.

Date

2013-12

Authors

Beeson, John D. (John David)

Access rights

Worldwide access.
Access changed 5/31/16.

Abstract

In this dissertation we discuss two topics relevant to statistical analysis. The first is a new test of linearity for a stationary time series that extends the bootstrap methods of Berg et al. (2010) to the goodness-of-fit (GoF) statistics specified in Harvill (1999) and Jahan and Harvill (2008). Berg's bootstrap method applies the statistics of Hinich (1982) within an autoregressive bootstrap procedure; we show that substituting the GoF statistics increases the power of the test. In Chapter Three we discuss an alternative approach to the regularized discriminant method of Friedman (1989). Regularized discriminant analysis (RDA) is a well-known method of covariance regularization for the multivariate-normal-based discriminant function; it unifies linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and mean-eigenvalue covariance regularization in a single framework. The original method and its known extensions rely on cross-validation in potentially high dimensions and can be computationally intensive. We propose using the Kullback-Leibler divergence as an optimization criterion for estimating a linear combination of the class covariance structures, which increases the accuracy of the RDA method and limits the use of leave-one-out cross-validation.
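The covariance regularization the abstract refers to can be sketched in a few lines. This is not the dissertation's estimator; it is a minimal illustration, assuming Friedman's (1989) two-parameter blend of a class covariance with the pooled covariance and a scaled identity, together with the closed-form KL divergence between zero-mean Gaussians. The function names `rda_covariance` and `kl_gaussian` are hypothetical, chosen for the example.

```python
import numpy as np

def rda_covariance(S_k, S_pooled, lam, gamma):
    """Friedman-style RDA covariance: blend the class covariance S_k
    with the pooled covariance S_pooled (weight lam), then shrink the
    result toward a scaled identity (weight gamma)."""
    d = S_k.shape[0]
    S_lam = (1.0 - lam) * S_k + lam * S_pooled
    return (1.0 - gamma) * S_lam + gamma * (np.trace(S_lam) / d) * np.eye(d)

def kl_gaussian(S0, S1):
    """KL divergence KL(N(0, S0) || N(0, S1)) between zero-mean
    multivariate normals; nonnegative, zero iff S0 == S1."""
    d = S0.shape[0]
    S1_inv = np.linalg.inv(S1)
    return 0.5 * (np.trace(S1_inv @ S0) - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

# Illustrative use: score a regularized covariance against the raw
# class estimate with the KL divergence (hypothetical example values).
S_k = np.array([[2.0, 0.3],
                [0.3, 1.0]])
S_pooled = np.eye(2)
S_reg = rda_covariance(S_k, S_pooled, lam=0.5, gamma=0.0)
score = kl_gaussian(S_k, S_reg)
```

The dissertation's contribution, as the abstract states, is to choose the blending weights by optimizing a KL-divergence criterion rather than by leave-one-out cross-validation over a grid of (lambda, gamma) pairs; the sketch above only shows the two ingredients of that idea.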

Keywords

Statistics; Statistical hypothesis testing; Multivariate analysis
