Weekly plans for week 38 and update from last week

Hi all FYS-STK folks, we hope all is well and that you are off to a good start with project 1.
This week at the lab we keep working on project 1. We will also keep the two digital labs as long as there is interest.

Last week we discussed resampling methods like cross-validation, the bootstrap, and the jackknife, as well as possible ways to understand what Ridge and Lasso regression mean. The slides from last week at https://compphysics.github.io/MachineLearning/doc/web/course.html (scroll down to week 37) cover what was discussed during the lectures. The links to the videos from the lectures can be found in the weekly slides, at the lecture info pages on the official UiO website, and via the schedule link for the jupyter-book at
https://compphysics.github.io/MachineLearning/doc/LectureNotes/_build/html/schedule.html
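
As a small illustration of the bootstrap part of that discussion, here is a minimal Python sketch. This is an assumed setup, not the project code: the synthetic data, the polynomial degree, and the number of bootstrap samples are arbitrary choices for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.utils import resample

rng = np.random.default_rng(2021)
x = np.linspace(0, 1, 100).reshape(-1, 1)
y = np.exp(-x**2) + 0.1 * rng.standard_normal(x.shape)

# Design matrix for a fifth-degree polynomial fit (arbitrary choice)
X = PolynomialFeatures(degree=5).fit_transform(x)

n_bootstraps = 200
predictions = np.empty((x.shape[0], n_bootstraps))
for b in range(n_bootstraps):
    # Draw a bootstrap sample (with replacement) and refit the model
    X_b, y_b = resample(X, y)
    model = LinearRegression(fit_intercept=False).fit(X_b, y_b)
    predictions[:, b] = model.predict(X).ravel()

# The spread over bootstrap fits estimates the variance of the predictions
print("mean prediction variance:", predictions.var(axis=1).mean())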

This week, the plan is to summarize our discussion of linear regression and what we have done during these first weeks, and to discuss project 1. For the latter, we will in particular discuss examples (see the slides for week 38) of how to relate our own methods to those of Scikit-Learn (scaling/normalizing, or not, being one of the issues). We will also discuss other technicalities concerning project 1. The examples you find in the slides for week 38 (before the logistic regression material) can also be discussed during the lab sessions.
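
To make the comparison with Scikit-Learn concrete, here is a minimal sketch (the synthetic data and all parameter values are assumptions for illustration) of fitting ordinary least squares both with our own normal-equations code and with LinearRegression, with and without standardized features:

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2021)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(100)

# Own OLS via the normal equations (pseudo-inverse for numerical stability)
X1 = np.column_stack([np.ones(len(X)), X])   # explicit intercept column
beta_own = np.linalg.pinv(X1.T @ X1) @ X1.T @ y

# Scikit-Learn fits the intercept separately by default
ols = LinearRegression().fit(X, y)
print("own OLS:     ", beta_own)
print("Scikit-Learn:", ols.intercept_, ols.coef_)

# With standardization the coefficients change scale, but the fitted
# model describes the same data
Xs = StandardScaler().fit_transform(X)
ols_scaled = LinearRegression().fit(Xs, y)
print("scaled fit:  ", ols_scaled.intercept_, ols_scaled.coef_)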

After the summary (first lecture Thursday), we move on to classification problems. Our first method, logistic regression, is also a deterministic one. It will serve as a way to introduce optimization methods like the family of gradient descent methods, and as input to our discussion of neural networks.
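
As a preview of both topics, here is a minimal sketch of logistic regression trained with plain gradient descent. The synthetic labels, learning rate, and iteration count are assumed values for illustration, not a prescription:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2021)
n = 200
X = rng.standard_normal((n, 2))
# Synthetic binary labels from a known linear decision boundary
y = (X @ np.array([2.0, -1.0]) + 0.5 > 0).astype(float)

X1 = np.column_stack([np.ones(n), X])   # intercept column
beta = np.zeros(X1.shape[1])
eta = 0.1                               # learning rate (assumed value)

for _ in range(1000):
    p = sigmoid(X1 @ beta)
    # Gradient of the (average) cross-entropy cost
    grad = X1.T @ (p - y) / n
    beta -= eta * grad

accuracy = ((sigmoid(X1 @ beta) > 0.5) == y).mean()
print("coefficients:", beta, "accuracy:", accuracy)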

The reading recommendations are:

See the lecture notes for week 38 at https://compphysics.github.io/MachineLearning/doc/web/course.html.

Bishop, sections 4.1, 4.2 and 4.3. Not all the material is relevant or will be covered. Section 4.3 is the most relevant, but sections 4.1 and 4.2 give interesting background reading for logistic regression.

Hastie et al., sections 4.1, 4.2 and 4.3 on logistic regression.

For a good discussion of gradient methods, see Goodfellow et al., sections 4.3-4.5 and chapter 8. We will come back to the latter chapter in our discussion of neural networks as well.

Best wishes to you all,
Morten et al
