Estimation and testing under sparsity: École d'été de probabilités de Saint-Flour XLV - 2015 / Sara van de Geer

Main entry (corporate): École d'été de probabilités de Saint-Flour, 45, Saint-Flour (2015)
Co-author: Geer, Sara A. van de (1958-) - Author
Document type: Conference proceedings
Series: Lecture notes in mathematics, École d'été de probabilités de Saint-Flour ; 2159
Language: English
Country: Switzerland
Publisher: Cham : Springer, 2016
Description: 1 vol. (XIII-274 p.) : fig. ; 24 cm
ISBN: 9783319327730 ; pbk.
Abstract: The book deals with models of high-dimensional data, that is, models where the number of parameters to be estimated is larger than the number of observations available for parameter estimation. Such models are now very important because, thanks to significant technological advances, large volumes of observations can be, and often are, recorded (through the internet, cameras, smartphones, etc.). In addition, the parameter set may be sparse, that is, the number of truly relevant parameters is smaller than the number of observations, but it is not known beforehand how many there are. An important technique for parameter estimation in such high-dimensional models is the Lasso method. The book uses this method as the starting point and the basis for understanding the other methods it presents and discusses, such as those inducing structured sparsity or low rank, or those based on more general loss functions. The book provides several examples and illustrations of the methods presented and discussed, and each of its 17 chapters ends with a problem section. Thus, it can be used as a textbook for students, mainly at postgraduate level. (zbMath)
Bibliography: Bibliography p. 267-269. Index.
MSC subjects: 60-02 Probability theory and stochastic processes -- Research exposition (monographs, survey articles)
60F05 Probability theory and stochastic processes -- Limit theorems -- Central limit and other weak theorems
60F17 Probability theory and stochastic processes -- Limit theorems -- Functional limit theorems; invariance principles
62J07 Statistics -- Linear inference, regression -- Ridge regression; shrinkage estimators
62J12 Statistics -- Linear inference, regression -- Generalized linear models
Online: SpringerLink - abstract | zbMath | MSN
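
The abstract above singles out the Lasso as the book's central estimation method. As a point of reference only, a common textbook formulation of the Lasso estimator for the high-dimensional linear model is sketched below; the notation is assumed here for illustration and may differ from the book's own conventions.

% Illustrative sketch only: a standard formulation of the Lasso estimator for the
% linear model Y = X beta^0 + epsilon with n observations and p parameters,
% possibly p >> n. Notation assumed for illustration, not quoted from the book.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  \hat{\beta}
  \;=\;
  \arg\min_{\beta \in \mathbb{R}^{p}}
  \Bigl\{ \tfrac{1}{n}\,\lVert Y - X\beta \rVert_{2}^{2}
          \;+\; 2\lambda\,\lVert \beta \rVert_{1} \Bigr\},
  \qquad \lambda > 0,
\]
where $X$ is the $n \times p$ design matrix and the $\ell_{1}$ penalty with tuning
parameter $\lambda$ encourages many entries of $\hat{\beta}$ to be exactly zero (sparsity).
\end{document}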
Location: Salle S
Call number: 12441-01 / Ecole STF
Status: Available


