An Introduction to Statistical Learning: with Applications in R

Springer Nature, Jul 29, 2021 - Mathematics - 607 pages

An Introduction to Statistical Learning provides an accessible overview of the field of statistical learning, an essential toolset for making sense of the vast and complex data sets that have emerged in fields ranging from biology to finance to marketing to astrophysics in the past twenty years. This book presents some of the most important modeling and prediction techniques, along with relevant applications. Topics include linear regression, classification, resampling methods, shrinkage approaches, tree-based methods, support vector machines, clustering, deep learning, survival analysis, multiple testing, and more. Color graphics and real-world examples are used to illustrate the methods presented. Since the goal of this textbook is to facilitate the use of these statistical learning techniques by practitioners in science, industry, and other fields, each chapter contains a tutorial on implementing the analyses and methods presented in R, an extremely popular open source statistical software platform.
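To give a sense of what those chapter tutorials involve, the following is a minimal illustrative sketch of fitting and inspecting a linear regression in base R. The data set (the built-in mtcars) and the model are placeholders chosen for illustration, not examples taken from the book's labs:

    # Illustrative only: mtcars and this model are stand-ins, not examples from the book.
    fit <- lm(mpg ~ wt + hp, data = mtcars)   # regress fuel economy on weight and horsepower
    summary(fit)                               # coefficient estimates, standard errors, R-squared
    predict(fit, newdata = data.frame(wt = 3.0, hp = 110))   # prediction for a new observation

Each chapter of the book closes with a lab in this style, working through the chapter's methods on real data sets.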

Two of the authors co-wrote The Elements of Statistical Learning (Hastie, Tibshirani and Friedman, 2nd edition 2009), a popular reference book for statistics and machine learning researchers. An Introduction to Statistical Learning covers many of the same topics, but at a level accessible to a much broader audience. This book is targeted at statisticians and non-statisticians alike who wish to use cutting-edge statistical learning techniques to analyze their data. The text assumes only a previous course in linear regression and no knowledge of matrix algebra.

This Second Edition features new chapters on deep learning, survival analysis, and multiple testing, as well as expanded treatments of naïve Bayes, generalized linear models, Bayesian additive regression trees, and matrix completion. R code has been updated throughout to ensure compatibility.


Contents

1 Introduction  1
2 Statistical Learning  15
3 Linear Regression  59
4 Classification  129
5 Resampling Methods  196
6 Linear Model Selection and Regularization  225
7 Moving Beyond Linearity  289
8 Tree-Based Methods  327
9 Support Vector Machines  366
10 Deep Learning  403
11 Survival Analysis and Censored Data  461
12 Unsupervised Learning  497
13 Multiple Testing  553
Index  596


About the author (2021)

Gareth James is a professor of data sciences and operations, and the E. Morgan Stanley Chair in Business Administration, at the University of Southern California. He has published an extensive body of methodological work in the domain of statistical learning with particular emphasis on high-dimensional and functional data. The conceptual framework for this book grew out of his MBA elective courses in this area.

Daniela Witten is a professor of statistics and biostatistics, and the Dorothy Gilford Endowed Chair, at the University of Washington. Her research focuses largely on statistical machine learning techniques for the analysis of complex, messy, and large-scale data, with an emphasis on unsupervised learning.

Trevor Hastie and Robert Tibshirani are professors of statistics at Stanford University and co-authors of the successful textbook The Elements of Statistical Learning. Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap.
