Advisor: Harvill, Jane L.
Author: Beeson, John D. (John David)
Date accessioned: 2014-01-28
Date available: 2014-01-28
Date issued: 2013-12
URI: http://hdl.handle.net/2104/8896

Abstract: In this dissertation we discuss two topics in statistical analysis. The first is a new test of linearity for a stationary time series that extends the bootstrap methods of Berg et al. (2010) to the goodness-of-fit (GoF) statistics specified in Harvill (1999) and Jahan and Harvill (2008). Berg's bootstrap method applies the statistics of Hinich (1982) within an autoregressive bootstrap procedure; we show that substituting GoF statistics increases the power of the test. In Chapter Three we discuss an alternative approach to the Friedman (1989) regularized discriminant method. Regularized discriminant analysis (RDA) is a well-known method of covariance regularization for the multivariate normal-based discriminant function. RDA generalizes linear (LDA), quadratic (QDA), and mean-eigenvalue covariance regularization methods into one framework. The original method and its known extensions rely on cross-validation in potentially high dimensions and can be computationally expensive. We propose using the Kullback-Leibler divergence as an optimization criterion to estimate a linear combination of class covariance structures, which increases the accuracy of the RDA method and limits the use of leave-one-out cross-validation.

Language: en-US
Rights: Baylor University theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. Contact librarywebmaster@baylor.edu for inquiries about permission.
Subjects: Statistics; Statistical hypothesis testing; Multivariate analysis
Title: Topics in multivariate covariance estimation and time series analysis
Type: Thesis
Access: Worldwide access. Access changed 5/31/16.
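The abstract's first topic describes an autoregressive bootstrap test of linearity. The following is a minimal sketch of that bootstrap framework only: the statistic `linearity_stat` is a McLeod-Li-type placeholder standing in for the GoF statistics of Harvill (1999) and Jahan and Harvill (2008), which are not reproduced here, and both function names are hypothetical.

```python
# Minimal sketch of an autoregressive bootstrap linearity test.
# ASSUMPTION: linearity_stat is a McLeod-Li-type placeholder, not the
# dissertation's GoF statistic; all names here are illustrative.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

def linearity_stat(x, p, m=5):
    """Placeholder nonlinearity measure: n times the sum of squared
    autocorrelations (lags 1..m) of the squared AR(p) residuals."""
    r = AutoReg(x, lags=p, trend="c").fit().resid
    s = r ** 2 - np.mean(r ** 2)
    acf = np.array([np.sum(s[k:] * s[:-k]) for k in range(1, m + 1)]) / np.sum(s * s)
    return len(r) * np.sum(acf ** 2)

def ar_bootstrap_pvalue(x, p, n_boot=500, burn=100, seed=None):
    """Fit a linear AR(p) null model, resample centered residuals,
    regenerate series under the null, and compare the observed statistic
    to the bootstrap distribution."""
    rng = np.random.default_rng(seed)
    fit = AutoReg(x, lags=p, trend="c").fit()
    resid = fit.resid - fit.resid.mean()
    params = np.asarray(fit.params)          # [const, phi_1, ..., phi_p]
    stat_obs = linearity_stat(x, p)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        e = rng.choice(resid, size=len(x) + burn, replace=True)
        sim = np.zeros(len(x) + burn)
        for t in range(p, len(sim)):
            # AR(p) recursion driven by resampled residuals (burn-in discarded)
            sim[t] = params[0] + params[1:] @ sim[t - p:t][::-1] + e[t]
        stats[b] = linearity_stat(sim[burn:], p)
    return (1 + np.sum(stats >= stat_obs)) / (n_boot + 1)
```

Under the null hypothesis of a linear AR(p) process, the bootstrap replicates mimic the sampling distribution of the statistic, so an observed value that is large relative to that distribution indicates neglected nonlinearity.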
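For the second topic, here is a minimal sketch of Friedman's (1989) regularized covariance estimator together with the Gaussian Kullback-Leibler divergence. The sample-splitting criterion in `kl_weight` is an illustrative assumption of how a KL divergence could select a mixing weight without a full leave-one-out loop; it is not the dissertation's actual optimization, and all three function names are hypothetical.

```python
# Sketch: Friedman (1989) RDA covariance and a KL-based mixing weight.
# ASSUMPTION: the kl_weight criterion below is illustrative only.
import numpy as np

def rda_cov(S_k, n_k, S_pool, n, lam, gam):
    """Friedman's regularized class covariance. lam blends the class and
    pooled covariances (QDA at lam=0, LDA-like at lam=1); gam shrinks
    toward a scaled identity (mean-eigenvalue regularization)."""
    Sig = ((1 - lam) * n_k * S_k + lam * n * S_pool) / ((1 - lam) * n_k + lam * n)
    p = Sig.shape[0]
    return (1 - gam) * Sig + gam * (np.trace(Sig) / p) * np.eye(p)

def gauss_kl(S0, S1):
    """KL divergence between zero-mean Gaussians N(0, S0) and N(0, S1)."""
    p = S0.shape[0]
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(np.linalg.solve(S1, S0)) - p + ld1 - ld0)

def kl_weight(X_k, S_pool, n, grid=np.linspace(0.0, 1.0, 21)):
    """Illustrative criterion: estimate the class covariance on one half of
    the class data, then pick lam so that the blended estimate is closest
    in KL divergence to the held-out half's covariance (gam fixed at 0).
    Assumes each half has more observations than dimensions."""
    half = len(X_k) // 2
    S_a = np.cov(X_k[:half], rowvar=False)
    S_b = np.cov(X_k[half:], rowvar=False)
    return min(grid, key=lambda lam: gauss_kl(S_b, rda_cov(S_a, half, S_pool, n, lam, 0.0)))
```

Because the weight is chosen by minimizing a closed-form divergence over a small grid rather than by refitting the classifier for every held-out observation, a criterion of this kind avoids the leave-one-out cross-validation loop the abstract mentions.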