Workshop: Introduction to recursive machine learning algorithms

This workshop is a hands-on introduction to machine learning methods that use data as it arrives. The algorithms update their internal state with every new piece of information and produce estimates of unmeasured or derived quantities. Some of the algorithms also revise previously stored information to improve those estimates.


Date

13.01.2023, 20.01.2023, 27.01.2023, from 9:00 to 17:15

Duration

3 days (~ 24 h)

Language

English

Costs

CHF 960

Requirements

English, linear algebra (basic), probability theory (basic), programming experience (beginner), calculus (integrals and derivatives, basic)

Target group

Due to its requirements, the workshop is suited for advanced students, master's students, and professionals who need to understand this type of algorithm.

Place

Campus Rapperswil-Jona, Room 1.255

The workshop has an applied perspective: the following topics are covered to the depth needed to understand the ideas and apply them to simple examples. Each topic comes with examples and interactive activities.

The workshop splits the main loop of the Kalman filter algorithm (shaded in red in the image below) into three sessions. Each session takes a full day of training.

Session structure. Shaded in red is the main loop of the Kalman filter; each 1-day session covers a part of it. Source: Kalman filter at Wikipedia.

The first session (S1 in the image) covers modeling and the prediction step of the algorithm. The second session (S2 in the image) covers the update step, which involves concepts from probability and Bayes' rule. The last session (not shown in the figure) is about using the algorithm in different applications, some brought by the participants.
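The prediction and update steps described above can be sketched as a minimal linear Kalman filter. The constant-velocity model, noise levels, and measurement values below are illustrative assumptions, not the workshop's exact material.

```python
import numpy as np

# Minimal linear Kalman filter for a 1D constant-velocity model.
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])              # only the position is measured
Q = 0.01 * np.eye(2)                    # process noise covariance
R = np.array([[0.25]])                  # measurement noise covariance

x = np.array([[0.0], [1.0]])            # initial state estimate
P = np.eye(2)                           # initial estimate covariance

for z in [1.1, 1.9, 3.2, 3.8]:          # stream of position measurements
    # Prediction step (S1): propagate state and uncertainty
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step (S2): correct the prediction with the new measurement
    y = np.array([[z]]) - H @ x         # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
```

Note that each iteration uses only the current measurement plus the stored state `x` and covariance `P`; no past data is kept, which is what makes the algorithm recursive.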

First day: modeling
Session 1.1: Overview. Filtering and smoothing.
Session 1.2: Iterated maps. Straight line as iterated map.
Session 1.3: Error propagation. Gaussian distribution.
Session 1.4: Stochastic modeling. Statistical dependence and causality.
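The "straight line as iterated map" idea of session 1.2 can be sketched in a few lines: instead of evaluating y = a*x + b directly, the line is generated recursively, one step at a time. The constants below are illustrative assumptions.

```python
# A straight line y = a*x + b viewed as an iterated map:
# each step advances x by dx and updates y recursively,
# y_{k+1} = y_k + a*dx, using only the previous value of y.
a, b, dx = 2.0, 1.0, 0.5
x, y = 0.0, b
points = [(x, y)]
for _ in range(4):
    x += dx
    y += a * dx          # recursive update; no access to earlier data needed
    points.append((x, y))
# every generated point lies on y = 2*x + 1
```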

Second day: inference
Session 2.1: Recap. Q&A.
Session 2.2: Conditional probability. Bayesian models. Kalman–Bucy filter.
Session 2.3: Data-driven models. Recursive regression. Delayed regression. Extreme learning machines.
Session 2.4: Parameter estimation.
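The recursive regression of session 2.3 can be sketched as recursive least squares: the parameter estimate is refined with every new (input, output) pair, without storing past data. The model, noise level, and data below are illustrative assumptions.

```python
import numpy as np

# Recursive least squares: refine the parameter estimate theta
# one sample at a time, keeping only theta and its covariance P.
rng = np.random.default_rng(0)
theta_true = np.array([1.5, -0.7])      # "unknown" parameters to recover

theta = np.zeros(2)                     # current parameter estimate
P = 1000.0 * np.eye(2)                  # large P = uninformative prior

for _ in range(200):
    phi = rng.normal(size=2)                     # regressor vector
    y = phi @ theta_true + 0.05 * rng.normal()   # noisy measurement
    # Gain, then correct the estimate by the prediction error
    K = P @ phi / (1.0 + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = P - np.outer(K, phi) @ P
```

After the loop, `theta` is close to `theta_true`; the structure (gain, innovation, covariance update) mirrors the Kalman filter update step of session 2.2.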

Third day: applications
Session 3.1: Recap. Q&A.
Session 3.2: Emptying water tank. Linearization. Inflow estimation.
Session 3.3: Ordinary differential equations (ODEs). Discretization of ODEs. 2D object tracking.
Session 3.4: Interconnected tanks and participant-contributed models.
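The discretization of ODEs in session 3.3 can be sketched with the emptying-tank model of session 3.2: a forward-Euler step turns the continuous dynamics into a recursive update. The outflow constant, step size, and levels below are illustrative assumptions.

```python
import math

# Forward-Euler discretization of an emptying-tank ODE,
#   dh/dt = -c * sqrt(h)   (Torricelli-type outflow),
# turning the continuous model into the recursion
#   h_{k+1} = h_k - dt * c * sqrt(h_k).
c, dt = 0.1, 0.01
h = 1.0                              # initial water level
t = 0.0
while h > 0.5:                       # simulate until the level halves
    h = h - dt * c * math.sqrt(h)    # recursive update of the level
    t += dt
```

For this ODE the analytic solution h(t) = (sqrt(h0) - c*t/2)**2 is available, so the accumulated time `t` can be checked against it; with dt = 0.01 the Euler result lands close to the exact value of about 5.86.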

Daily schedule

09:00 – 10:30 Session 1
10:30 – 10:45 Coffee break
10:45 – 12:30 Session 2
12:30 – 13:30 Lunch
13:30 – 14:45 Session 3
14:45 – 15:00 Coffee break
15:00 – 16:00 Session 4
16:00 – 17:00 Day wrap-up

Book for the workshop

Särkkä, S. (2013). Bayesian Filtering and Smoothing (Institute of Mathematical Statistics Textbooks). Cambridge: Cambridge University Press. doi:10.1017/CBO9781139344203
More details on the author’s webpage


Other readings

Simo Särkkä and Arno Solin (2019). Applied Stochastic Differential Equations. Cambridge University Press.

Strogatz, S. H. (2015). Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering. CRC press. ISBN 978-0813349107

Carl Edward Rasmussen and Christopher K. I. Williams (2006). Gaussian Processes for Machine Learning. The MIT Press. ISBN 0-262-18253-X

Lennart Ljung (2010), Perspectives on system identification, Annual Reviews in Control, Volume 34, Issue 1, Pages 1-12, ISSN 1367-5788

Kalman, R. E. (1960). A New Approach to Linear Filtering and Prediction Problems. Journal of Basic Engineering, 82(1), 35–45. The famous paper describing a recursive solution to the discrete-data linear filtering problem.

Judea Pearl. 2019. The seven tools of causal inference, with reflections on machine learning. Commun. ACM 62, 3 (March 2019), 54–60. https://doi.org/10.1145/3241036

Peters, J., Janzing, D., & Schölkopf, B. (2017). Elements of Causal Inference: Foundations and Learning Algorithms. Cambridge, MA: MIT Press. ISBN 978-0-262-03731-0

Pillonetto, G., Chen, T., Chiuso, A., De Nicolao, G., & Ljung, L. (2022). Regularized system identification: Learning dynamic models from data. Springer.

Dr. Juan Pablo Carbajal

IET Institut für Energietechnik, Research Associate

+41 58 257 42 64 juanpablo.carbajal@ost.ch