
Statistical Learning with Sparsity: The Lasso and Generalizations

Hastie, Trevor / Tibshirani, Robert / Wainwright, Martin


Publisher: CRC Press
Print ISBN: 9781498712163
Publication year: 2018
Product code: 1363111
Category: Books / Science / Mathematics

Available formats

Format: Print book
Quantity available: 1
Member price: $147.00
Non-member price: $147.00

*Prices are in Canadian dollars. Taxes and shipping extra.




Description

Discover New Methods for Dealing with High-Dimensional Data

A sparse statistical model has only a small number of nonzero parameters or weights; it is therefore much easier to estimate and interpret than a dense model. Statistical Learning with Sparsity: The Lasso and Generalizations presents methods that exploit sparsity to help recover the underlying signal in a set of data.

Top experts in this rapidly evolving field, the authors describe the lasso for linear regression and a simple coordinate descent algorithm for its computation. They discuss the application of ℓ1 penalties to generalized linear models and support vector machines, cover generalized penalties such as the elastic net and group lasso, and review numerical methods for optimization. They also present statistical inference methods for fitted (lasso) models, including the bootstrap, Bayesian methods, and recently developed approaches. In addition, the book examines matrix decomposition, sparse multivariate analysis, graphical models, and compressed sensing. It concludes with a survey of theoretical results for the lasso.

In this age of big data, the number of features measured on a person or object can be large, and might be larger than the number of observations. This book shows how the sparsity assumption allows us to tackle these problems and extract useful and reproducible patterns from big datasets. Data analysts, computer scientists, and theorists will appreciate this thorough and up-to-date treatment of sparse statistical modeling.
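To give a flavor of the book's core topic, here is a minimal Python sketch of the lasso fitted by cyclic coordinate descent, the algorithm the blurb mentions. It is illustrative only: the objective scaling, function names, and synthetic data are assumptions for this example, not material taken from the book.

```python
# Minimal sketch: lasso via cyclic coordinate descent (illustrative, not the
# book's code). Minimizes (1/2n) * ||y - X @ beta||^2 + lam * ||beta||_1.
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: sign(z) * max(|z| - gamma, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n      # precompute (1/n) * x_j' x_j
    residual = y - X @ beta                # full residual, kept up to date
    for _ in range(n_iter):
        for j in range(p):
            # rho_j = (1/n) * x_j' (partial residual excluding coordinate j)
            rho = X[:, j] @ residual / n + col_sq[j] * beta[j]
            new_bj = soft_threshold(rho, lam) / col_sq[j]
            residual += X[:, j] * (beta[j] - new_bj)   # update residual in place
            beta[j] = new_bj
    return beta

# Tiny demo on synthetic data with a sparse true signal (assumed setup).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.5 * rng.standard_normal(100)
print(np.round(lasso_cd(X, y, lam=0.1), 2))  # most coefficients shrink to exactly 0
```

With the penalty lam set high enough, the soft-thresholding step zeroes out the coefficients of irrelevant features, which is the sparsity-inducing behavior of the ℓ1 penalty that the book develops in depth.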