This repository contains Jupyter notebooks that implement the algorithms from the textbook *The Elements of Statistical Learning*, along with section-by-section summaries.
Chapter 2
- 2.3 Least Squares and Nearest Neighbors
- 2.4 Statistical Decision Theory
- 2.5 Local Methods in High Dimensions
- 2.6 Statistical Models, Supervised Learning and Function Approximation
- 2.7 Structured Regression Models
- 2.8 Classes of Restricted Estimators
- 2.9 Model Selection and the Bias-Variance Tradeoff
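As a quick taste of the methods in Chapter 2, here is a minimal sketch (a toy example of my own, not taken from the notebooks) contrasting the two approaches of Section 2.3: a least-squares linear fit and a k-nearest-neighbor average on the same 1-D data.

```python
import numpy as np

# Synthetic 1-D regression data: y = 2x + small Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

# Least squares: minimize ||X beta - y||^2 with an intercept column.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

def knn_predict(x0, k=3):
    """Average the y-values of the k training points nearest to x0."""
    idx = np.argsort(np.abs(x - x0))[:k]
    return y[idx].mean()

print(beta)              # slope should be close to 2
print(knn_predict(0.5))  # local average near x = 0.5
```

Least squares assumes a global linear structure; k-NN makes almost no structural assumption but pays for it in variance, which is the bias-variance tension that Section 2.9 develops.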
Chapter 3
- 3.1 Introduction
- 3.2 Linear Regression Models and Least Squares
- 3.2.1 Example: Prostate Cancer
- 3.2.2 The Gauss–Markov Theorem
- 3.2.3 Multiple Regression From Simple Univariate Regression
- 3.2.4 Multiple Outputs
- 3.3 Subset Selection
- 3.4 Shrinkage Methods
- 3.4.1 Ridge Regression
- 3.4.2 The Lasso
- TODO: 3.4.3 Discussion: Subset Selection, Ridge Regression and the Lasso
- 3.4.4 Least Angle Regression
- 3.5 Methods Using Derived Input Directions
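For Chapter 3's shrinkage methods, a minimal sketch of ridge regression (Section 3.4.1) on synthetic data, using the closed-form solution beta = (XᵀX + λI)⁻¹Xᵀy with centered inputs so the intercept is not penalized; the data and coefficients here are illustrative assumptions, not from the notebooks.

```python
import numpy as np

# Synthetic design: 50 samples, 3 predictors, known coefficients.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=50)

def ridge(X, y, lam):
    """Closed-form ridge fit; inputs centered so lambda leaves the intercept alone."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    p = X.shape[1]
    beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(p), Xc.T @ yc)
    intercept = y.mean() - X.mean(axis=0) @ beta
    return intercept, beta

b0, beta = ridge(X, y, lam=1.0)
print(beta)  # near the generating coefficients, shrunk slightly toward zero
```

Increasing `lam` shrinks the coefficient vector toward zero, which is the defining behavior ridge shares with the lasso of Section 3.4.2 (the lasso's L1 penalty additionally sets coefficients exactly to zero).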
Chapter 4
- 4.1 Introduction
- 4.2 Linear Regression of an Indicator Matrix
- 4.3 Linear Discriminant Analysis
- 4.3.1 Regularized Discriminant Analysis
- 4.3.2 Computations for LDA
- 4.3.3 Reduced-Rank Linear Discriminant Analysis
- 4.4 Logistic Regression
- 4.4.1 Fitting Logistic Regression Models
- 4.4.2 Example: South African Heart Disease
- 4.4.3 Quadratic Approximations and Inference
- 4.4.4 L1 Regularized Logistic Regression
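For Chapter 4, a hedged sketch of fitting logistic regression by Newton-Raphson, the iteratively reweighted least squares scheme described in Section 4.4.1; the data are synthetic and the iteration count is an arbitrary choice for this toy problem.

```python
import numpy as np

# Synthetic binary outcomes from a known logistic model.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 1.5])
prob = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = (rng.random(200) < prob).astype(float)

# Newton-Raphson / IRLS: each step solves a weighted least-squares system.
beta = np.zeros(2)
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))   # fitted probabilities
    W = mu * (1.0 - mu)                    # IRLS weights
    grad = X.T @ (y - mu)                  # score vector
    hess = X.T @ (X * W[:, None])          # observed information
    beta = beta + np.linalg.solve(hess, grad)

print(beta)  # should land near the generating coefficients
```

Each Newton step is a weighted least-squares fit, which is why Section 4.4.1 frames the algorithm as IRLS; the L1-regularized variant of Section 4.4.4 adds a lasso penalty to the same objective.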
Chapter 11