
Advanced Machine Learning Course

2023.10.16 - 2023.12.18

Number of sessions: 10

Format: Face-to-face

Presented by: AIC

(1) Outline of the course

The course covers advanced topics in classical machine learning and deep learning. We first cover linear algebra and probability and statistics as preliminaries, and then explain representative machine learning methods and theories with an emphasis on the underlying mathematics. Each class is lecture-style with distributed materials, consisting of a 60-minute lecture on theory followed by 30 minutes of implementation in Python or exercises on the mathematical material. Implementations are written from scratch so that students can fully grasp the content of the class. The course also delves deeper into the theoretical side of the material explained in “Introduction to Machine Learning,” so it is desirable for students to have completed that workshop beforehand. The objective of this course is to provide participants with the knowledge essential for conducting research related to machine learning, as well as knowledge useful for approaching cutting-edge methods. The content will be adjusted according to the proficiency of the participants.

(2) Contents of each session

Session 1: Review of Linear Algebra 10/16

The purpose of this lecture is to give a brief overview of machine learning and a general idea of the overall flow of the course.

In addition, a light lecture will be given on linear algebra, an area of great importance in this and other engineering fields. Specifically, eigenvalues and eigenvectors, symmetric and transpose matrices, and diagonalization will be covered.

This session corresponds to Appendix C of PRML.
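
As a taste of the from-scratch Python exercises, a minimal NumPy sketch of the facts covered here; the matrix A is a made-up illustrative example:

```python
import numpy as np

# A real symmetric matrix: its eigenvalues are real and it can be
# diagonalized by an orthogonal matrix of eigenvectors (A = P D P^T).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, P = np.linalg.eigh(A)   # eigh is specialized for symmetric matrices
D = np.diag(eigvals)

print("eigenvalues:", eigvals)                       # [1. 3.]
print("A equals P D P^T:", np.allclose(A, P @ D @ P.T))
print("A equals its transpose:", np.allclose(A, A.T))
```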

Session 2: Polynomial Regression Analysis 10/23

First, curve fitting will be explained, and then regression analysis with polynomial functions will be presented. This lecture relates to Chapters 1 and 3 of PRML.
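
A minimal sketch of the kind of in-class implementation, assuming the sin(2πx) toy data used as the running example in PRML Chapter 1; the degree and noise level are illustrative choices:

```python
import numpy as np

# Fit a degree-3 polynomial to noisy samples of sin(2*pi*x) by least
# squares, written from scratch (no scikit-learn).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=20)
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.shape)

degree = 3
Phi = np.vander(x, degree + 1, increasing=True)  # design matrix [1, x, x^2, x^3]

# Solves the normal equations w = (Phi^T Phi)^{-1} Phi^T t stably
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
print("fitted coefficients:", w)
```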

Session 3: Fundamentals of Probability and Statistics (Non-Bayesian Probability) 10/30

This lecture explains probability and statistics, the main mathematical fields required for machine learning. The extension from the discrete probability taught before university to continuous probability, as well as non-Bayesian (frequentist) probability based on trials and experiments, will be briefly explained.

Specifically, this session aims at mastery of concepts such as random variables, probability distributions, conditional probability, probability density functions, expected value, variance, and independence.
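
To connect the definitions to computation, a small simulation sketch; the dice example is illustrative:

```python
import numpy as np

# Empirically check E[X], Var[X], and independence for two dice rolls,
# against the analytic values E[X] = 3.5 and Var[X] = 35/12.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.integers(1, 7, size=n)   # first die
y = rng.integers(1, 7, size=n)   # second die, independent of x

print("E[X] ~", x.mean())        # ~3.5
print("Var[X] ~", x.var())       # ~35/12 ~ 2.917
# For independent variables, E[XY] = E[X] E[Y]
print("E[XY] ~", (x * y).mean(), " E[X]E[Y] ~", x.mean() * y.mean())
```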

Session 4: Fundamentals of Probability and Statistics (Bayesian Probability, Probability Distributions) 11/6

This lecture covers Bayesian probability. Bayesian probability theory built on Bayes’ theorem will be treated thoroughly, and probability distributions will be briefly reviewed. In addition, students will work through concrete calculations by deriving posterior distributions over the parameters and predictive distributions for unknown data, applying Bayesian inference to the polynomial regression treated in the second session.
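
As a sketch of this calculation, the closed-form posterior and predictive distribution for Bayesian polynomial regression (PRML Section 3.3); the prior precision alpha, noise precision beta, and degree are illustrative values:

```python
import numpy as np

# Bayesian polynomial regression: with a Gaussian prior w ~ N(0, alpha^{-1} I)
# and noise precision beta, the posterior over the weights is Gaussian with
# the closed form below (PRML eqs. 3.53-3.54), and the predictive
# distribution at a new input is also Gaussian (3.58-3.59).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=10)
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.shape)

alpha, beta, degree = 2.0, 25.0, 3
Phi = np.vander(x, degree + 1, increasing=True)

S_N = np.linalg.inv(alpha * np.eye(degree + 1) + beta * Phi.T @ Phi)
m_N = beta * S_N @ Phi.T @ t                  # posterior mean of w

# Predictive distribution at a new input x*
x_new = 0.5
phi = np.vander([x_new], degree + 1, increasing=True)[0]
mean = phi @ m_N
var = 1.0 / beta + phi @ S_N @ phi
print(f"p(t | x*={x_new}) = N({mean:.3f}, {var:.3f})")
```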

Session 5: Likelihood, Logistic Regression 11/13

The concept of likelihood, which is essential for understanding generative models, will be explained, and probabilistic models using Bayesian logistic regression will be discussed. The prerequisites for the later extension to variational inference are also introduced here. This session corresponds to Chapter 4 of PRML.
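
As a warm-up for the likelihood concept, a minimal maximum-likelihood logistic regression fitted by gradient ascent on the log-likelihood; the Bayesian treatment discussed in class additionally places a prior over w, and the data and step size here are made up:

```python
import numpy as np

# Generate synthetic labels from a known logistic model, then recover the
# weights by maximizing the log-likelihood with gradient ascent.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
true_w = np.array([2.0, -1.0])
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

def sigmoid(a):
    return 1 / (1 + np.exp(-a))

w = np.zeros(2)
lr = 1.0
for _ in range(1000):
    p = sigmoid(X @ w)
    grad = X.T @ (y - p)     # gradient of the log-likelihood w.r.t. w
    w += lr * grad / n
print("estimated w:", w)     # roughly recovers true_w
```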

Session 6: EM Algorithm 12/4

After explaining k-means clustering, the most fundamental clustering method, the lecture will discuss its relationship to the EM algorithm. The discussion will then be extended from the EM algorithm for Gaussian mixture models to the general EM algorithm.
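
A compact from-scratch sketch of EM for a two-component one-dimensional Gaussian mixture; the initial parameter guesses are arbitrary:

```python
import numpy as np

# EM for a two-component 1D Gaussian mixture, with the E-step
# (responsibilities) and M-step (closed-form updates) written out.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

pi, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])

def gauss(x, m, v):
    return np.exp(-(x - m) ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

for _ in range(100):
    # E-step: responsibility of each component for each data point
    r = pi * gauss(data[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities
    Nk = r.sum(axis=0)
    mu = (r * data[:, None]).sum(axis=0) / Nk
    var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(data)

print("weights:", pi, "means:", mu, "variances:", var)
```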

Session 7: Variational Inference 12/11

Approximating the posterior distribution is one of the most fundamental problems in machine learning. This session presents the framework of deterministic approximate inference. Building on the EM algorithm from the sixth session, students will learn methods for approximating the full posterior distribution over all random variables, and will come to understand what distinguishes “deterministic” methods from “stochastic” (sampling-based) ones. This session corresponds to PRML Chapter 10.
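
As an illustration of deterministic approximate inference, a sketch of mean-field variational inference for a Gaussian with unknown mean and precision (PRML Section 10.1.3); the priors are illustrative weak choices:

```python
import numpy as np

# Assume the factorization q(mu, tau) = q(mu) q(tau) and iterate the two
# coordinate updates: q(mu) is Gaussian and q(tau) is Gamma (PRML 10.26-10.30).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=100)
N, xbar = len(x), x.mean()

mu0, lam0, a0, b0 = 0.0, 1.0, 1e-3, 1e-3   # weak conjugate prior
E_tau = 1.0                                 # initial guess for E[tau]

for _ in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N)
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # Update q(tau) = Gamma(a_N, b_N), using E[mu], E[mu^2] under q(mu)
    a_N = a0 + (N + 1) / 2
    E_mu2 = mu_N ** 2 + 1 / lam_N
    b_N = b0 + 0.5 * (np.sum(x ** 2) - 2 * mu_N * np.sum(x) + N * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * mu_N + mu0 ** 2))
    E_tau = a_N / b_N

print("E[mu] ~", mu_N, " E[tau] ~", E_tau, "(true tau =", 1 / 0.5 ** 2, ")")
```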

Session 8: Variational Inference 12/18

Continuing from Session 7, this lecture deepens students’ understanding of variational inference methods. In contrast to the seventh session, the focus is on local methods, i.e., inference applied to individual variables of the model and to functions of groups of variables. The session also covers inferring hyperparameters from data and expectation propagation (EP), a practical algorithm for approximate inference. This session corresponds to PRML Chapter 10.
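
As a numerical illustration of the local idea, a sketch checking the local variational lower bound on the logistic sigmoid (PRML Section 10.6); a full EP implementation is beyond a short sketch, so only the bound is verified here:

```python
import numpy as np

# Jaakkola-Jordan bound: for any xi,
#   sigma(x) >= sigma(xi) * exp((x - xi)/2 - lam(xi) * (x^2 - xi^2)),
# with lam(xi) = (sigma(xi) - 1/2) / (2*xi) and equality at x = +/- xi.
# The bound is "local": it is applied to one factor of the model at a time.
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def lam(xi):
    return (sigmoid(xi) - 0.5) / (2 * xi)

def lower_bound(x, xi):
    return sigmoid(xi) * np.exp((x - xi) / 2 - lam(xi) * (x ** 2 - xi ** 2))

x = np.linspace(-6, 6, 13)
xi = 2.5
print("bound holds:", np.all(lower_bound(x, xi) <= sigmoid(x) + 1e-12))
print("tight at x = xi:", np.isclose(lower_bound(xi, xi), sigmoid(xi)))
```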

Session 9: Approximate Inference (MCMC, Metropolis-Hastings)

Sampling methods, an important class of approximate computations for evaluating the expectation of a function under a given probability distribution, will be surveyed. After touching on basic sampling algorithms, the lecture aims to deepen understanding of Markov chain Monte Carlo methods, which exploit the Markov property. This session corresponds to PRML Chapter 11.
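
A minimal random-walk Metropolis-Hastings sketch for a one-dimensional unnormalized target; the target density and step size are illustrative:

```python
import numpy as np

# Target an unnormalized density p(x) proportional to exp(-x^4 / 4):
# propose x' ~ N(x, step^2) and, since the proposal is symmetric,
# accept with probability min(1, p(x') / p(x)).
rng = np.random.default_rng(0)

def log_p(x):                        # log of the unnormalized target
    return -x ** 4 / 4

x, step, samples = 0.0, 1.0, []
for _ in range(50_000):
    x_prop = x + step * rng.normal()
    if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
        x = x_prop                   # accept; otherwise keep the old x
    samples.append(x)

samples = np.array(samples[5_000:])  # discard burn-in
print("mean ~", samples.mean(), " variance ~", samples.var())
```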

Session 10: Approximate Inference (Gibbs Sampling)

Students will understand Gibbs sampling, a special case of the Metropolis-Hastings method. The lecture will also explain when each sampling method is effective. If time permits, more advanced topics will be covered beyond those already presented in the lecture.
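
A minimal Gibbs sampling sketch for a correlated bivariate Gaussian; the correlation rho is an illustrative value:

```python
import numpy as np

# For a zero-mean bivariate Gaussian with unit variances and correlation
# rho, each full conditional is itself Gaussian:
#   x1 | x2 ~ N(rho * x2, 1 - rho^2)  (and symmetrically for x2 | x1),
# so Gibbs sampling alternates exact draws from the two conditionals.
# No accept/reject step is needed: this is the Metropolis-Hastings
# special case whose proposals are always accepted.
rng = np.random.default_rng(0)
rho = 0.8
x1, x2 = 0.0, 0.0
samples = []
for _ in range(20_000):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))
    samples.append((x1, x2))

samples = np.array(samples[2_000:])  # discard burn-in
print("empirical correlation ~", np.corrcoef(samples.T)[0, 1])  # ~0.8
```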
