Machine learning is broadly about estimating functions using optimization algorithms. We can think of these algorithms as searching through a space of functions for one that minimizes a measure of inaccuracy, or loss.
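As a minimal sketch of this idea, gradient descent repeatedly nudges a parameter in the direction that decreases the loss. The data, model, and learning rate below are illustrative choices, not taken from the course notebooks:

```python
import numpy as np

# Illustrative example: fit the slope w of a simple model f(x) = w * x
# by gradient descent on the mean squared error loss.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)  # true slope is 3

w = 0.0    # initial guess
lr = 0.1   # step size (learning rate), chosen for illustration
for _ in range(200):
    grad = np.mean(2 * (w * x - y) * x)  # d/dw of the mean squared error
    w -= lr * grad                        # step against the gradient

print(w)  # converges near the true slope of 3
```

Each iteration moves `w` downhill on the loss surface; with a suitable step size the iterates converge to the least-squares solution.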

| Link | Type | Description |
|---|---|---|
| html pdf | Slides | Optimization and model complexity |
| html | Notebook | Gradient descent |
| Rmd | Notebook | Stochastic gradient descent |
| Rmd | Notebook | Stepwise variable selection |

*To be updated*

- ISLR Chapter 6, Section 1 only.
- ISLR Chapter 7, Sections 1-3 and 6.
- MLstory Chapter 5 on optimization: read the section on *Stochastic gradient descent* and stop after the *SGD quick start guide*.

- MLstory Chapter 5 on optimization: the rest of the chapter (note that it contains some more advanced material).
- Wikipedia on Newton's method and gradient descent (good for pictures and animations)

Slides for optimization video (PDF)

Slides for overfitting video (PDF)

Notebook for generalization (partially complete)

Notebook for optimization (partially complete)

Notebook for regularization (partially complete)

Text and figures are licensed under Creative Commons Attribution CC BY 4.0. The figures that have been reused from other sources don't fall under this license and can be recognized by a note in their caption: "Figure from ...".

For attribution, please cite this work as

Loftus (2021, Oct. 7). machine learning 4 data science: 5 Optimization and model complexity. Retrieved from http://ml4ds.com/weeks/05-optimization/

BibTeX citation

@misc{loftus20215,
  author = {Loftus, Joshua},
  title = {machine learning 4 data science: 5 Optimization and model complexity},
  url = {http://ml4ds.com/weeks/05-optimization/},
  year = {2021}
}