Weekly plans for week 37 and update from last week

Welcome back to a new and exciting ML week.

Last week we discussed Ridge and Lasso regression using linear algebra and the singular value decomposition of a matrix, and linked everything to a statistical interpretation. There we noted that, using the maximum likelihood estimation ansatz, we could derive the ordinary least squares equations. Using Bayes' theorem we were able to obtain an alternative derivation of the Ridge and Lasso equations.
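
To make this connection concrete, here is a minimal numpy sketch (not from the lectures; it uses synthetic data and an arbitrarily chosen regularization parameter lmbda) showing how both the OLS and the Ridge solutions can be written in terms of the singular value decomposition of the design matrix.

    import numpy as np

    # Synthetic data: n samples, p features (illustrative values only)
    rng = np.random.default_rng(2021)
    n, p = 100, 5
    X = rng.normal(size=(n, p))
    beta_true = np.arange(1, p + 1, dtype=float)
    y = X @ beta_true + 0.1 * rng.normal(size=n)

    # Singular value decomposition of the design matrix, X = U S V^T
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # OLS solution: beta = V S^{-1} U^T y
    beta_ols = Vt.T @ np.diag(1.0 / s) @ U.T @ y

    # Ridge solution: beta = V diag(s_i / (s_i^2 + lambda)) U^T y
    lmbda = 0.1  # hypothetical choice, for illustration only
    beta_ridge = Vt.T @ np.diag(s / (s**2 + lmbda)) @ U.T @ y

    print("OLS:  ", beta_ols)
    print("Ridge:", beta_ridge)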

This week we discuss resampling techniques, and in particular we will focus on two widely used methods for obtaining better estimates of expected values, the so-called bootstrap method and the cross-validation method. We will also discuss the bias-variance tradeoff and other statistical quantities such as the estimation of confidence intervals. These are all topics of relevance for project 1.
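
To give a feel for what these methods look like in practice, here is a small sketch (not the official project code; the data set, the number of bootstrap samples and the number of folds are chosen arbitrarily) that estimates the expected test error of a linear fit with both a simple bootstrap and k-fold cross-validation using numpy and scikit-learn.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score, train_test_split
    from sklearn.utils import resample

    # Synthetic one-dimensional data (illustrative only)
    rng = np.random.default_rng(2021)
    x = np.linspace(0, 1, 200)
    y = 2.0 + 3.0 * x**2 + 0.1 * rng.normal(size=x.size)
    X = np.column_stack([x, x**2])

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=2021)

    # Bootstrap: refit the model on resampled training data and
    # collect the test-set MSE for each bootstrap sample
    n_bootstraps = 100
    mse_boot = np.empty(n_bootstraps)
    for i in range(n_bootstraps):
        X_b, y_b = resample(X_train, y_train, random_state=i)
        y_pred = LinearRegression().fit(X_b, y_b).predict(X_test)
        mse_boot[i] = np.mean((y_test - y_pred) ** 2)
    print(f"Bootstrap test MSE: {mse_boot.mean():.4f} +/- {mse_boot.std():.4f}")

    # k-fold cross-validation on the full data set
    scores = cross_val_score(LinearRegression(), X, y, cv=5,
                             scoring="neg_mean_squared_error")
    print(f"5-fold CV test MSE: {-scores.mean():.4f}")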

By the end of this week you should thus have all the theoretical elements needed for project 1. We will also use parts of the lectures in the coming weeks to discuss project 1 and related topics.

The schedule and reading suggestions for this week are thus

  • Lab Wednesday: work on project 1
  • Lecture Thursday: Resampling methods, cross-validation and Bootstrap
  • Lecture Friday: More on Resampling methods and summary of linear regression
  • Recommended Reading:
    • Lectures on Resampling methods for week 37 at https://compphysics.github.io/MachineLearning/doc/web/course.html.
    • Bishop 1.3 (cross-validation) and 3.2 (bias-variance tradeoff)
    • Hastie et al., Chapter 7; we recommend in particular sections 7.1-7.5 and 7.10 (cross-validation) and 7.11 (bootstrap). This chapter covers these topics better than Bishop's text. Goodfellow et al. discuss some of these topics in sections 5.2-5.5.

Best wishes to you all and see you at the labs.

Published Sep. 14, 2021 23:36 - Last modified Sep. 14, 2021 23:36