Course homepage for Optimisation Algorithms in Statistics II (PhD course 2021; 3.5 HEC)

Summary

Building on the topics discussed in the first part of the course, we continue by deepening the theoretical basis for stochastic optimisation algorithms. Specifically, we discuss theory for stochastic gradient ascent (including momentum and adaptive step sizes), simulated annealing, and particle swarm optimisation, including results on convergence and convergence speed.

We will also cover some important optimisation algorithms that were not included in the first part, namely quasi-Newton methods and genetic algorithms.

As in the first part, we will implement the methods in R. Examples from machine learning and optimal design will illustrate the algorithms.

You are most welcome to the course!
Frank Miller, Department of Statistics, Stockholm University
frank.miller@stat.su.se

Topic 1: Stochastic gradient descent and quasi-Newton methods

Lectures: March 23; Time 10-12, 13-15 (registered participants will receive a link to all lectures).

Reading:

Example code: no new example code, but note that example code for steepest ascent was provided in the first part of the course and might serve as a basis for Problem 1.3. A small program for the log likelihood of logistic regression and its derivative was also posted in the first part; it included the dataset used in Problem 1.1.
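For orientation before the lecture, below is a minimal, self-contained R sketch of stochastic gradient descent on a synthetic least-squares problem, followed by a quasi-Newton (BFGS) fit of the same objective using base R's optim. This is not the course's example code; the data and all tuning values (step-size schedule, number of epochs) are illustrative assumptions.

## Minimal SGD sketch on a synthetic least-squares problem (illustrative only)
set.seed(1)
n <- 1000
x <- cbind(1, rnorm(n))                  # design matrix with intercept column
beta_true <- c(2, -1)
y <- as.vector(x %*% beta_true + rnorm(n, sd = 0.5))

sgd <- function(x, y, stepsize = 0.1, epochs = 50) {
  beta <- rep(0, ncol(x))
  for (e in seq_len(epochs)) {
    for (i in sample(nrow(x))) {         # one random observation per update
      grad <- -2 * x[i, ] * (y[i] - sum(x[i, ] * beta))
      beta <- beta - (stepsize / e) * grad   # step size shrinks with epoch
    }
  }
  beta
}
sgd(x, y)                                # should be close to beta_true

## Quasi-Newton (BFGS) on the same objective, via base R's optim
mse <- function(b) mean((y - x %*% b)^2)
optim(c(0, 0), mse, method = "BFGS")$par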

Assignment 1 (Deadline April 12).

Topic 2: Particle swarm optimisation and stochastic gradient descent with momentum

Lectures: April 13; Time 10-12, 13-15.

Reading:

  • Bonyadi MR, Michalewicz Z (2016). Stability analysis of the particle swarm optimization without stagnation assumption. IEEE Transactions on Evolutionary Computation, 20(5):814–819.
  • Cleghorn CW, Engelbrecht AP (2018). Particle swarm stability: a theoretical extension using the non-stagnate distribution assumption. Swarm Intelligence, 12(1):1–22.
  • Clerc M (2016). Chapter 8: Particle swarms. In: Siarry P (ed.), Metaheuristics.

Example code: PSO_order1_stability.r
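As a complement to the posted file, here is a minimal, self-contained PSO sketch in R (not PSO_order1_stability.r itself) minimising the two-dimensional Rastrigin test function. Swarm size, inertia weight w, and acceleration coefficients c1 and c2 are illustrative choices; whether particle trajectories are (order-1) stable depends on exactly these constants, which is what the readings above analyse.

## Minimal PSO sketch minimising the 2-d Rastrigin function (illustrative only)
set.seed(1)
f <- function(z) 10 * length(z) + sum(z^2 - 10 * cos(2 * pi * z))
npart <- 20; d <- 2; iters <- 200
w <- 0.7; c1 <- 1.5; c2 <- 1.5                    # inertia and acceleration weights

pos <- matrix(runif(npart * d, -5, 5), npart, d)  # initial positions
vel <- matrix(0, npart, d)                        # initial velocities
pbest <- pos                                      # personal best positions
pbest_val <- apply(pos, 1, f)
gbest <- pbest[which.min(pbest_val), ]            # global best position

for (t in seq_len(iters)) {
  for (i in seq_len(npart)) {
    r1 <- runif(d); r2 <- runif(d)                # fresh random weights
    vel[i, ] <- w * vel[i, ] +
      c1 * r1 * (pbest[i, ] - pos[i, ]) +
      c2 * r2 * (gbest - pos[i, ])
    pos[i, ] <- pos[i, ] + vel[i, ]
    val <- f(pos[i, ])
    if (val < pbest_val[i]) { pbest[i, ] <- pos[i, ]; pbest_val[i] <- val }
  }
  gbest <- pbest[which.min(pbest_val), ]
}
gbest                                             # true optimum is the origin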

Assignment 2 (Deadline April 26).

Topic 3: Simulated annealing and genetic algorithms

Lectures: April 27; Time 10-12, 13-15.

Reading:

  • Givens GH, Hoeting JA (2013). Computational Statistics, 2nd edition. John Wiley & Sons, Inc., Hoboken, New Jersey. Sections 3.1 to 3.4.

Example code: crit_HA3.r
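For orientation, here is a minimal, self-contained simulated annealing sketch in R (not crit_HA3.r) minimising a one-dimensional multimodal function. The random-walk proposal, its standard deviation, and the geometric cooling schedule are illustrative assumptions in the style of the Givens & Hoeting chapter above.

## Minimal simulated annealing sketch (illustrative only)
set.seed(1)
f <- function(x) (x^2 - 4)^2 + 5 * sin(6 * x)   # multimodal test function
x_cur <- 0; f_cur <- f(x_cur)
x_best <- x_cur; f_best <- f_cur
temp <- 10
for (t in seq_len(5000)) {
  x_prop <- x_cur + rnorm(1, sd = 0.5)          # random-walk proposal
  f_prop <- f(x_prop)
  ## accept better moves always; worse moves with prob exp(-increase/temp)
  if (f_prop < f_cur || runif(1) < exp(-(f_prop - f_cur) / temp)) {
    x_cur <- x_prop; f_cur <- f_prop
    if (f_cur < f_best) { x_best <- x_cur; f_best <- f_cur }
  }
  temp <- 0.999 * temp                          # geometric cooling schedule
}
c(x_best, f_best)

A genetic algorithm instead evolves a whole population of candidate solutions through selection, crossover, and mutation; both methods are covered in the reading above.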

Assignment 3 (Deadline May 11).


Prerequisites

The course is intended for Ph.D. students in Statistics or a related field (e.g. Mathematical Statistics, Engineering Science, Quantitative Finance, Computer Science). Prior knowledge of the following is expected:

  • familiarity with the content of the first part of the course,
  • familiarity with R (or another programming language with similar capabilities),
  • basic knowledge in multivariate calculus (e.g. from a Multivariate Statistics course),
  • statistical inference (e.g. from a Master's level course).

Examination and grading

The course is graded Pass or Fail. Examination is through three individual home assignments.

Course literature

We will not use a single course book. Instead, several articles, book chapters, and other learning resources will be recommended.

Course structure

The topics will be discussed during three online meetings via Zoom. Course participants will spend most of their study time solving the problem sets for each topic on their own computers, without supervision. The course will be held in March and April 2021.

Course schedule
  • Lecture 1: March 23; Time 10-12, 13-15 (online, Zoom)
  • Lecture 2: April 13; Time 10-12, 13-15 (online, Zoom)
  • Lecture 3: April 27; Time 10-12, 13-15 (online, Zoom)

Registration

To register for the course, please send me an email (frank.miller at stat.su.se) by March 9, 2021. You are also welcome to contact me with any questions related to the course.