
Exercises

Weekly exercises will be posted here, normally after the lectures on Thursdays.

Note that the link to the book website given in the book no longer works; the new link is https://hastie.su.domains/ElemStatLearn/

A link to the extra exercises is here.

A link to some solution proposals.

  • Exercises for May 16
    • Exercise 10.4 from the textbook (you may use the code from the lecture on 12.05.2022)
    • Exercise 5 from chapter 8 of the new edition of ISLR book
    • Exam STK2100 Spring 2019: Exercise 1 (but skip the question about the lift curve)
  • Exercises for May 2
    • Exercise 6.2 from the textbook
    • Take the script ozone_kernel.R and modify the commands to perform cross-validation for evaluating the local linear regression model. Use this cross-validation procedure to find the optimal span parameter.
    • Exam STK2100 June 15 2021, exercises 1 and 2 (this might be postponed to next week)
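For the cross-validation exercise above, a minimal sketch of leave-one-out CV over candidate span values is shown below. This is not the code from ozone_kernel.R; it assumes a hypothetical data frame `dat` with a single predictor `x` and response `y`, and uses `loess` with `degree = 1` as the local linear fit.

```r
# Hypothetical sketch: leave-one-out CV to choose the span of a local
# linear regression. Assumes a data frame `dat` with columns x and y.
cv_span <- function(dat, spans) {
  sapply(spans, function(s) {
    errs <- sapply(seq_len(nrow(dat)), function(i) {
      # fit on all observations except i; surface = "direct" allows
      # prediction at the held-out point
      fit <- loess(y ~ x, data = dat[-i, ], span = s, degree = 1,
                   control = loess.control(surface = "direct"))
      dat$y[i] - predict(fit, newdata = dat[i, ])
    })
    mean(errs^2, na.rm = TRUE)  # CV mean squared error for this span
  })
}

spans  <- seq(0.2, 1, by = 0.1)
cv_err <- cv_span(dat, spans)
spans[which.min(cv_err)]        # span with the smallest CV error
```

The grid of spans is an illustrative choice; a finer grid (or a plot of `cv_err` against `spans`) helps check that the minimum is not at the boundary.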
  • Exercises for April 25
    • Exercises 7.9 and 7.10 from the textbook.
    • Exercise 3.4 of the new edition of ISLR book
    • For the Abalone data from the lecture, estimate the RMSE using
      • all data both for training and testing
      • separate training and test sets with 50% each
      • leave-one-out cross-validation
      • 5-fold cross-validation
      • 20-fold cross-validation
      • 100-fold cross-validation
      • summarize your results
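A generic k-fold routine covers all the cases above (leave-one-out is the special case k = n). This is only a sketch: the data frame name `abalone`, the response `rings`, and the plain linear model are placeholder assumptions to be replaced by the data and model from the lecture.

```r
# Sketch of k-fold cross-validated RMSE. Assumes a data frame `abalone`
# with response column `rings`; the lm() model is a placeholder.
cv_rmse <- function(data, k) {
  # assign each observation to one of k folds at random
  folds  <- sample(rep(seq_len(k), length.out = nrow(data)))
  sq_err <- numeric(nrow(data))
  for (f in seq_len(k)) {
    test <- folds == f
    fit  <- lm(rings ~ ., data = data[!test, ])      # train on k-1 folds
    sq_err[test] <- (data$rings[test] -
                       predict(fit, data[test, ]))^2  # test on held-out fold
  }
  sqrt(mean(sq_err))
}

cv_rmse(abalone, 5)               # 5-fold CV
cv_rmse(abalone, nrow(abalone))   # leave-one-out (k = n)
```

The in-sample RMSE ("all data for both training and testing") and the 50/50 split are computed the same way but without the fold loop.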
  • Exercises for April 4
    • 1. A torch example on a small data set
      • a) Implement logistic regression in torch with SGD/GD as the optimizer. The optimizer should be written by hand.
      • b) Tune the learning rate and its cooling schedule for the optimizer from a), and apply the model to the Default data set from library(ISLR).
      • c) Draw the obtained decision boundary
      • d) Add regularization (weight decay) and  play around with its strength
      • e) Draw the obtained decision boundaries
      • f) Now, implement a neural network with two hidden layers of sizes 8 and 4 and repeat a)-e) for this model.
    • 2. Implement LeNet (https://en.wikipedia.org/wiki/LeNet)
      • a) Implement LeNet with weight decay in torch
      • b) Implement SGD optimizer by hand 
      • c) Train the model on MNIST train set
      • d) Evaluate accuracy on MNIST test set
      • e) Try Adam and SGD optimizers from torch library and repeat b)-d) for them
      • f) Repeat a)-e) for FMNIST and KMNIST datasets available in torchvision library
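For part 1 a), the core of a hand-written SGD optimizer is the per-observation update rule; the sketch below shows it in plain R, and the same rule carries over to torch tensors once the model is expressed there. The starting rate `lr0 = 0.1` and the `0.9^t` cooling schedule are illustrative assumptions, not part of the exercise text.

```r
# Sketch of logistic regression fitted by hand-written SGD in plain R.
# Assumes a numeric matrix X with an intercept column and a 0/1 vector y;
# lr0 and the decay factor are illustrative tuning choices.
sigmoid <- function(z) 1 / (1 + exp(-z))

sgd_logistic <- function(X, y, epochs = 50, lr0 = 0.1, decay = 0.9) {
  w <- rep(0, ncol(X))
  for (t in seq_len(epochs)) {
    lr <- lr0 * decay^t                  # cooled learning rate
    for (i in sample(nrow(X))) {         # one random-order pass over the data
      p <- sigmoid(sum(X[i, ] * w))      # predicted probability
      w <- w + lr * (y[i] - p) * X[i, ]  # gradient ascent on the log-likelihood
    }
  }
  w
}
```

Setting the inner loop to a single full-gradient step instead of per-observation updates gives the GD variant asked for in a).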
  • Exercises for March 28
    • Exercises 12 and 13 from the extra exercise set
    • Exercises 11.2 and 11.3 from textbook
  • Exercises for March 14:
    • From the ISLR book: Exercise 7.9.10 (the GAM model here can be similar to the ones used in the bone.R script)
    • Go through the sahear.R script. See if it is possible to simplify the models further by considering non-linear functions of nbp and age.
    • Extra exercises 9 and 6
    • Challenge: Extra exercise 7
  • Exercises for March 7:
    • From ISLR book: Exercise 7.9.1, 7.9.9
    • From textbook: Exercise 5.4
    • Exam STK2100 2018: Problem 2
    • Challenge: From textbook: Exercise 5.7
  • Exercises for February 28:
    • Implement Rosenblatt's Perceptron Learning Algorithm and apply it to the credit default data with income and balance as x1 and x2. Choose the step size of SGD to be a(t)=10*(0.9)^t.
    • Derive the Hessian for the log-likelihood of the logistic regression model.
    • From the new edition of the ISLR book: Exercise 4.10 (Chapter 4)
    • Challenge: Exercise 6 from the extra exercise set
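The perceptron exercise above can be sketched as follows. The decreasing step size a(t) = 10 * 0.9^t is taken from the exercise; everything else (labels coded as -1/+1, the epoch count, random visit order) is an assumption of this sketch, and the Default data's income and balance columns would be bound to X before calling it.

```r
# Sketch of Rosenblatt's perceptron learning algorithm with the step
# size a(t) = 10 * 0.9^t from the exercise. Assumes a numeric matrix X
# (columns x1 = income, x2 = balance) and labels y coded as -1/+1;
# standardising X first usually speeds up convergence.
perceptron <- function(X, y, epochs = 100) {
  X1 <- cbind(1, X)                       # add intercept column
  w  <- rep(0, ncol(X1))
  for (t in seq_len(epochs)) {
    a <- 10 * 0.9^t                       # cooling step size
    for (i in sample(nrow(X1))) {
      if (y[i] * sum(w * X1[i, ]) <= 0)   # point is misclassified
        w <- w + a * y[i] * X1[i, ]       # perceptron update
    }
  }
  w                                       # c(intercept, w1, w2)
}
```

The fitted boundary is the line w1*x1 + w2*x2 + intercept = 0, which can be drawn on a scatterplot of the two predictors.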
  • Exercises for February 21/25:
  • Exercises for February 14:
    • Exercises 3.5 and 3.29 from the textbook
    • From the ISLR book: Exercises 3.14, 6.1, 6.4, 6.10
    • Challenge: Exercise 2.9 from the textbook
  • Exercises for February 7:
    • From the textbook: Exercises 3.2
    • From the ISLR book (James et al): Exercise 3.4, 3.6, 3.9
    • Extra exercise 5
  • Exercises for January 31:
    • From the ISLR book: Exercises 3.3, 3.7, 3.5, 3.8 (section 3.7)
    • Extra exercises 1 and 4, see link above. Note that parts of these exercises have already been discussed in the lectures, but it is a good exercise to try them yourself!
  • Exercises for January 24:
    • Exercise 2.7 from the textbook.
    • Exercises 1, 2 and 8 from Chapter 2 of the ISLR book (James et al); see the webpage for the book for downloading the data.
      • Regarding Exercise 8: the easiest way of getting hold of the data is to install the ISLR library (through the command install.packages("ISLR")), make the library available (through library("ISLR")), and then make the data available through data(College)
Published Jan. 19, 2022 4:19 PM - Last modified Dec. 11, 2023 2:16 PM