Abstract

A common task in statistical practice is the estimation of unknown parameters from available data. When proposing a model, one could rely on unbiased estimators, meaning that the expected value of an estimator equals the true parameter being estimated. Historically, the absence of bias has often been considered an attractive property of an estimator, because it allows intuitive interpretation of the results. However, a fixation on bias reduction increases the variance, which deteriorates the predictive performance of a model. It may also happen that the data do not provide enough information to produce well-behaved unbiased estimates. In these so-called ill-conditioned problems, unbiased estimators will over-fit the data, or will be impossible to obtain. This work focuses on ill-conditioned problems in high-dimensional data analysis. Typical examples of high-dimensional data are signals and spectra in analytical chemistry, image data in computer vision, or microarray data in biology. In all chapters, penalized estimators, often called shrinkage estimators, are used to obtain estimates. This means that the usual loss function is augmented with some type of penalty function that constrains or shrinks the coefficients in the model. The concept of ill-posedness is further introduced in the next section. A short introduction to penalized estimation is given in Section 1.2. Section 1.3 provides a chapter-by-chapter introduction to the different topics within this thesis. In addition, it provides some more details concerning the data and its technological aspects that are not treated in the particular chapters.
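
As a brief illustration of the penalized estimation idea described above, the sketch below uses a ridge-type (quadratic) penalty on a simulated ill-conditioned problem; the penalty type, the data and the penalty weight are chosen purely for illustration and are not taken from the thesis itself.

import numpy as np

# Minimal sketch, assuming a ridge-type penalty and simulated data
# (illustrative only, not the thesis's specific models or penalties).
rng = np.random.default_rng(0)
n, p = 20, 50                      # more coefficients than observations: ill-conditioned
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0
y = X @ beta_true + rng.standard_normal(n)

lam = 1.0                          # penalty weight controlling the amount of shrinkage

# Ordinary least squares would require inverting X'X, which is singular here.
# Augmenting the least-squares loss with lam * ||beta||^2 gives the closed form
#   beta_hat = (X'X + lam * I)^(-1) X'y,
# which is well-conditioned and shrinks the coefficients toward zero.
beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print(beta_hat[:5])                # shrunken estimates of the nonzero coefficients

Larger values of lam shrink the coefficients more strongly, trading additional bias for a reduction in variance.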

P.H.C. Eilers (Paul), P.J. van der Spek (Peter)
Erasmus University Rotterdam
hdl.handle.net/1765/51414
Erasmus MC: University Medical Center Rotterdam

de Rooi, J. (2013, October 24). Penalized Estimation in High-Dimensional Data Analysis. Retrieved from http://hdl.handle.net/1765/51414