
Statistical Learning with Sparsity: The Lasso and Generalizations
Chapman & Hall/CRC Monographs on Statistics & Applied Probability
By Trevor Hastie, Robert Tibshirani, and Martin Wainwright


Table of Contents

1. Introduction
2. The Lasso for Linear Models
   - Introduction
   - The Lasso Estimator
   - Cross-Validation and Inference
   - Computation of the Lasso Solution
   - Degrees of Freedom
   - Uniqueness of the Lasso Solutions
   - A Glimpse at the Theory
   - The Nonnegative Garrote
   - ℓq Penalties and Bayes Estimates
   - Some Perspective
3. Generalized Linear Models
   - Introduction
   - Logistic Regression
   - Multiclass Logistic Regression
   - Log-Linear Models and the Poisson GLM
   - Cox Proportional Hazards Models
   - Support Vector Machines
   - Computational Details and glmnet
4. Generalizations of the Lasso Penalty
   - Introduction
   - The Elastic Net
   - The Group Lasso
   - Sparse Additive Models and the Group Lasso
   - The Fused Lasso
   - Nonconvex Penalties
5. Optimization Methods
   - Introduction
   - Convex Optimality Conditions
   - Gradient Descent
   - Coordinate Descent
   - A Simulation Study
   - Least Angle Regression
   - Alternating Direction Method of Multipliers
   - Minorization-Maximization Algorithms
   - Biconvexity and Alternating Minimization
   - Screening Rules
6. Statistical Inference
   - The Bayesian Lasso
   - The Bootstrap
   - Post-Selection Inference for the Lasso
   - Inference via a Debiased Lasso
   - Other Proposals for Post-Selection Inference
7. Matrix Decompositions, Approximations, and Completion
   - Introduction
   - The Singular Value Decomposition
   - Missing Data and Matrix Completion
   - Reduced-Rank Regression
   - A General Matrix Regression Framework
   - Penalized Matrix Decomposition
   - Additive Matrix Decomposition
8. Sparse Multivariate Methods
   - Introduction
   - Sparse Principal Components Analysis
   - Sparse Canonical Correlation Analysis
   - Sparse Linear Discriminant Analysis
   - Sparse Clustering
9. Graphs and Model Selection
   - Introduction
   - Basics of Graphical Models
   - Graph Selection via Penalized Likelihood
   - Graph Selection via Conditional Inference
   - Graphical Models with Hidden Variables
10. Signal Approximation and Compressed Sensing
    - Introduction
    - Signals and Sparse Representations
    - Random Projection and Approximation
    - Equivalence between ℓ0 and ℓ1 Recovery
11. Theoretical Results for the Lasso
    - Introduction
    - Bounds on Lasso ℓ2-error
    - Bounds on Prediction Error
    - Support Recovery in Linear Regression
    - Beyond the Basic Lasso

Bibliography
Author Index
Index

Bibliographic Notes and Exercises appear at the end of each chapter.
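The contents list glmnet ("Computational Details and glmnet"), the R package co-developed by Hastie and Tibshirani for fitting lasso and elastic-net models, and reviewers note that the book recommends open-source R packages throughout. As a rough flavor of that workflow, here is a minimal sketch on simulated data; the dimensions, seed, and coefficient values are illustrative assumptions, not an example from the book.

    # Minimal lasso sketch with glmnet on simulated data (illustrative only)
    library(glmnet)

    set.seed(1)
    n <- 100; p <- 20
    x <- matrix(rnorm(n * p), n, p)
    beta <- c(3, -2, 1.5, rep(0, p - 3))   # sparse "true" coefficients (assumed)
    y <- as.vector(x %*% beta + rnorm(n))

    fit <- glmnet(x, y, alpha = 1)         # alpha = 1 gives the lasso penalty
    cvfit <- cv.glmnet(x, y, alpha = 1)    # choose lambda by 10-fold cross-validation
    coef(cvfit, s = "lambda.min")          # sparse coefficient estimates at the selected lambda

Here cv.glmnet selects the penalty level by cross-validation, in the spirit of the "Cross-Validation and Inference" section listed above.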

About the Author

Trevor Hastie is the John A. Overdeck Professor of Statistics at Stanford University. Prior to joining Stanford, he worked at AT&T Bell Laboratories, where he helped develop the statistical modeling environment popular in the R computing system. He is known for his research in applied statistics, particularly in data mining, bioinformatics, and machine learning, and has published five books and over 180 research articles in these areas. In 2014, he received the Emanuel and Carol Parzen Prize for Statistical Innovation. He earned his PhD from Stanford University.

Robert Tibshirani is a professor in the Departments of Statistics and Health Research and Policy at Stanford University. He has authored five books, co-authored three books, and published over 200 research articles. He has made important contributions to the analysis of complex datasets, including the lasso and significance analysis of microarrays (SAM). He also co-authored the first study linking cell phone usage with car accidents, a widely cited article that has played a role in the introduction of legislation restricting the use of phones while driving. He received the prestigious COPSS Presidents' Award in 1996 and was elected to the National Academy of Sciences in 2012.

Martin Wainwright is a professor in the Department of Statistics and the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley. He is known for theoretical and methodological research at the interface between statistics and computation, with particular emphasis on high-dimensional statistics, machine learning, graphical models, and information theory. He has published over 80 papers and one book in these areas, received the COPSS Presidents' Award in 2014, and was a section lecturer at the International Congress of Mathematicians in 2014. He received his PhD in EECS from the Massachusetts Institute of Technology (MIT).

Reviews

"The authors study and analyze methods using the sparsity property of some statistical models in order to recover the underlying signal in a dataset. They focus on the Lasso technique as an alternative to the standard least-squares method." -Zentralblatt MATH 1319 "The book includes all the major branches of statistical learning. For each topic, the authors first give a concise introduction of the basic problem, evaluate conventional methods, pointing out their deficiencies, and then introduce a method based on sparsity. Thus, the book has the potential to be the standard textbook on the topic." -Anand Panangadan, California State University, Fullerton "It always first discusses regularized models based on equations, followed by example applications, before ending with a bibliography section detailing the historical development of the given method. Software recommendations (mostly open source R packages) are typically provided either in the main part or bibliography section of each chapter. And each chapter concludes with a set of selected exercises meant to deepen the gained knowledge on the given subject, which of course is of great help for teachers of statistics. For these reasons, we congratulate the authors of Statistical Learning with Sparsity and recommend the book to all statistically-inclined readers from intermediate to expert levels. In addition, it is worth pointing out that even for non-statisticians, the book is able to demonstrate,based on numerous real-world examples, the power of regularization."-Ivan Kondofersky and Fabian J. Theis, Institute for Computational Biology
