About this Course
70,836 recent views

Course 3 of 7 in the Advanced Machine Learning Specialization

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Advanced Level

The course requires a strong background in calculus, linear algebra, probability theory, and machine learning.

Approx. 39 hours to complete

Suggested: 6 weeks of study, 6 hours/week

English

Subtitles: English, Korean

Skills you will gain

Bayesian Optimization, Gaussian Process, Markov Chain Monte Carlo (MCMC), Variational Bayesian Methods

Syllabus - What you will learn from this course

Week 1
2 hours to complete

Introduction to Bayesian methods & Conjugate priors

Welcome to the first week of our course! Today we will discuss what Bayesian methods are and what probabilistic models are. We will see how they can be used to model real-life situations and how to draw conclusions from them. We will also learn about conjugate priors, a class of models where all the math becomes really simple; a small numeric sketch follows the lesson list below.
9 videos (Total 55 min), 1 reading, 2 quizzes
9 videos
Bayesian approach to statistics 5 min
How to define a model 3 min
Example: thief & alarm 11 min
Linear regression 10 min
Analytical inference 3 min
Conjugate distributions 2 min
Example: Normal, precision 5 min
Example: Bernoulli 4 min
1 reading
MLE estimation of Gaussian mean 10 min
2 practice exercises
Introduction to Bayesian methods 20 min
Conjugate priors 12 min
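
To make the conjugate-prior idea concrete, here is a minimal NumPy sketch (not part of the course materials): a Beta prior on the heads probability of a coin is updated with Bernoulli observations. The toy data and the Beta(1, 1) prior are made up for illustration.

import numpy as np

# Hypothetical coin-flip data (1 = heads); values chosen for illustration only.
data = np.array([1, 0, 1, 1, 0, 1, 1, 1])

a0, b0 = 1.0, 1.0                      # Beta(1, 1) prior over the heads probability
a_post = a0 + data.sum()               # add observed heads to the prior pseudo-count
b_post = b0 + len(data) - data.sum()   # add observed tails

# Because the Beta prior is conjugate to the Bernoulli likelihood, the posterior
# is again a Beta distribution and its mean is available in closed form.
print("posterior: Beta(%.0f, %.0f), mean %.3f" % (a_post, b_post, a_post / (a_post + b_post)))
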
Week 2
6 hours to complete

Expectation-Maximization algorithm

This week we will cover the central topic in probabilistic modeling: latent variable models and how to train them, namely with the Expectation-Maximization algorithm. We will see models for clustering and dimensionality reduction where the Expectation-Maximization algorithm can be applied as is; a toy numeric sketch follows the lesson list below. In weeks 3, 4, and 5 we will discuss numerous extensions of this algorithm that make it work for more complicated models and scale to large datasets.
17 videos (Total 168 min), 3 quizzes
17 videos
Probabilistic clustering 6 min
Gaussian Mixture Model 10 min
Training GMM 10 min
Example of GMM training 10 min
Jensen's inequality & Kullback-Leibler divergence 9 min
Expectation-Maximization algorithm 10 min
E-step details 12 min
M-step details 6 min
Example: EM for discrete mixture, E-step 10 min
Example: EM for discrete mixture, M-step 12 min
Summary of Expectation Maximization 6 min
General EM for GMM 12 min
K-means from probabilistic perspective 9 min
K-means, M-step 7 min
Probabilistic PCA 13 min
EM for Probabilistic PCA 7 min
2 practice exercises
EM algorithm 8 min
Latent Variable Models and EM algorithm 10 min
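
As a rough illustration of the E-step/M-step loop (not the course's assignment code), here is a minimal NumPy sketch of EM for a two-component 1-D Gaussian mixture; the synthetic data and initial parameter values are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two Gaussians (an assumption for this sketch).
x = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.5, 300)])

pi = np.array([0.5, 0.5])      # mixing weights
mu = np.array([-1.0, 1.0])     # initial means
var = np.array([1.0, 1.0])     # initial variances

def gauss(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

for _ in range(100):
    # E-step: responsibilities r[n, k] = p(component k | x_n).
    r = pi * gauss(x[:, None], mu, var)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted data.
    nk = r.sum(axis=0)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    pi = nk / len(x)

print("means:", mu, "stds:", np.sqrt(var), "weights:", pi)
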
Week 3
2 hours to complete

Variational Inference & Latent Dirichlet Allocation

This week we will move on to approximate inference methods. We will see why we care about approximating distributions, and study variational inference, one of the most powerful methods for this task. We will also look at the mean-field approximation in detail and apply it to a text-mining algorithm called Latent Dirichlet Allocation; a toy mean-field sketch follows the lesson list below.
11 videos (Total 98 min), 2 quizzes
11 videos
Mean field approximation 13 min
Example: Ising model 15 min
Variational EM & Review 5 min
Topic modeling 5 min
Dirichlet distribution 6 min
Latent Dirichlet Allocation 5 min
LDA: E-step, theta 11 min
LDA: E-step, z 8 min
LDA: M-step & prediction 13 min
Extensions of LDA 5 min
2 practice exercises
Variational inference 15 min
Latent Dirichlet Allocation 15 min
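
A minimal sketch of the mean-field idea, not LDA itself: coordinate-ascent updates for a factorized Gaussian approximation to a correlated bivariate Gaussian, a standard textbook toy problem where the mean and covariance values below are made up. It recovers the true mean but underestimates the variances, which is typical mean-field behaviour.

import numpy as np

# Target: a correlated bivariate Gaussian with mean `mu` and precision `lam`
# (the covariance values are made up for this toy example).
mu = np.array([1.0, -1.0])
cov = np.array([[1.0, 0.8],
                [0.8, 1.0]])
lam = np.linalg.inv(cov)                 # precision matrix

# Factorized approximation q(z1) q(z2): each factor is Gaussian with fixed
# variance 1 / lam[i, i]; only the means are updated by coordinate ascent.
m = np.zeros(2)
for _ in range(50):
    m[0] = mu[0] - lam[0, 1] / lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - lam[1, 0] / lam[1, 1] * (m[0] - mu[0])

print("mean-field means:", m, "(true mean:", mu, ")")
print("mean-field variances:", 1.0 / np.diag(lam), "(true variances:", np.diag(cov), ")")
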
Week 4
5 hours to complete

Markov chain Monte Carlo

This week we will learn how to approximate training and inference with sampling and how to sample from complicated distributions. This will give us a simple method to deal with LDA and with Bayesian neural networks: neural networks whose weights are random variables themselves, so that instead of training (finding the best value for the weights) we sample from the posterior distribution over the weights. A minimal sampler sketch follows the lesson list below.
11 videos (Total 122 min), 2 quizzes
11 videos
Sampling from 1-d distributions 13 min
Markov Chains 13 min
Gibbs sampling 12 min
Example of Gibbs sampling 7 min
Metropolis-Hastings 8 min
Metropolis-Hastings: choosing the critic 8 min
Example of Metropolis-Hastings 9 min
Markov Chain Monte Carlo summary 8 min
MCMC for LDA 15 min
Bayesian Neural Networks 11 min
1 practice exercise
Markov Chain Monte Carlo 20 min
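
A minimal random-walk Metropolis-Hastings sketch in NumPy (not the course's assignment code); the unnormalized target density and the proposal width are arbitrary choices made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Unnormalized target density: a made-up two-component Gaussian mixture.
    return 0.3 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.7 * np.exp(-0.5 * (x - 2.0) ** 2)

x = 0.0
samples = []
for _ in range(20000):
    x_prop = x + rng.normal(0.0, 1.0)                  # symmetric random-walk proposal
    # Acceptance probability; the proposal density cancels because it is symmetric.
    if rng.random() < min(1.0, target(x_prop) / target(x)):
        x = x_prop
    samples.append(x)

samples = np.array(samples[5000:])                      # drop burn-in
print("sample mean %.2f, sample std %.2f" % (samples.mean(), samples.std()))
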
Week 5
5 hours to complete

Variational Autoencoder

Welcome to the fifth week of the course! This week we will combine many ideas from the previous weeks and add some new ones to build a Variational Autoencoder: a model that can learn a distribution over structured data (such as photographs or molecules) and then sample new data points from the learned distribution, hallucinating photographs of people who do not exist. We will also apply the same techniques to Bayesian neural networks and see how this can greatly compress the weights of the network without reducing accuracy. A minimal sketch of the reparameterization trick follows the lesson list below.
10 videos (Total 79 min), 3 readings, 3 quizzes
10 videos
Modeling a distribution of images 10 min
Using CNNs with a mixture of Gaussians 8 min
Scaling variational EM 15 min
Gradient of decoder 6 min
Log derivative trick 6 min
Reparameterization trick 7 min
Learning with priors 5 min
Dropout as Bayesian procedure 5 min
Sparse variational dropout 5 min
3 readings
VAE paper 10 min
Relevant papers 10 min
Categorical Reparametrization with Gumbel-Softmax 10 min
2 practice exercises
Variational autoencoders 16 min
Categorical Reparametrization with Gumbel-Softmax 18 min
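
A minimal sketch of the reparameterization trick in NumPy, independent of any VAE code in the course: writing z = mu + sigma * eps lets one estimate gradients of E[f(z)] with respect to mu and sigma by plain Monte Carlo. Here f(z) = z^2 is chosen because the exact gradients (2*mu and 2*sigma) are known for comparison; the parameter values are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.7, 1.3          # parameters of q(z) = N(mu, sigma^2); values are arbitrary

# Objective: E_q[f(z)] with f(z) = z^2, whose exact gradients are 2*mu and 2*sigma.
eps = rng.standard_normal(100000)
z = mu + sigma * eps          # reparameterization: z is a deterministic function of
                              # (mu, sigma) and parameter-free noise eps
grad_mu = np.mean(2.0 * z)            # d f(z) / d mu    = f'(z) * dz/dmu    = 2z * 1
grad_sigma = np.mean(2.0 * z * eps)   # d f(z) / d sigma = f'(z) * dz/dsigma = 2z * eps

print("grad wrt mu:    %.3f (exact %.3f)" % (grad_mu, 2 * mu))
print("grad wrt sigma: %.3f (exact %.3f)" % (grad_sigma, 2 * sigma))
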
Week 6
4 hours to complete

Gaussian processes & Bayesian optimization

Welcome to the final week of our course! This time we will look at nonparametric Bayesian methods. Specifically, we will learn about Gaussian processes and their application to Bayesian optimization, which lets one optimize functions whose every evaluation is very expensive: oil probing, drug discovery, and neural network architecture tuning. A minimal sketch follows the lesson list below.
7 videos (Total 58 min), 2 quizzes
7 videos
Gaussian processes 8 min
GP for machine learning 5 min
Derivation of main formula 11 min
Nuances of GP 12 min
Bayesian optimization 10 min
Applications of Bayesian optimization 5 min
1 practice exercise
Gaussian Processes and Bayesian Optimization 16 min
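
A minimal sketch of Gaussian-process regression driving a simple Bayesian-optimization loop. An upper-confidence-bound acquisition is used here as one common choice; the kernel, objective function, and candidate grid are all made-up illustration values, not the course's.

import numpy as np

def kernel(a, b, length=1.0, sigma_f=1.0):
    # Squared-exponential (RBF) kernel between 1-D input arrays.
    return sigma_f ** 2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def objective(x):
    # Stand-in for an expensive black-box function (made up for the sketch).
    return -np.sin(3.0 * x) - x ** 2 + 0.7 * x

X = np.array([-1.0, 0.5, 1.5])          # initial evaluations
y = objective(X)
grid = np.linspace(-2.0, 2.0, 401)      # candidate points
noise = 1e-4                            # jitter for numerical stability

for _ in range(5):
    # GP posterior mean and variance on the grid, given evaluations so far.
    K = kernel(X, X) + noise * np.eye(len(X))
    K_s = kernel(grid, X)
    mean = K_s @ np.linalg.solve(K, y)
    v = np.linalg.solve(K, K_s.T)
    var = np.diag(kernel(grid, grid)) - np.sum(K_s * v.T, axis=1)
    # Upper-confidence-bound acquisition: query where the model is promising
    # and/or still uncertain, then evaluate the expensive function there.
    ucb = mean + 2.0 * np.sqrt(np.maximum(var, 0.0))
    x_next = grid[np.argmax(ucb)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best input found: %.3f with value %.3f" % (X[np.argmax(y)], y.max()))
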
5 hours to complete

Final project

In this module you will apply the methods you learned in this course to the final project.
1 quiz
4.6
91 reviews

60%

started a new career after completing this course

36%

got a tangible career benefit from this course

Top reviews

by JG, Nov 18, 2017

This course is a little difficult, but I found it very helpful. Also, I didn't find a better course on Bayesian methods anywhere on the net, so I will recommend this to anyone who wants to dive into Bayesian methods.

by VO, Apr 3, 2019

Great introduction to Bayesian methods, with quite good hands-on assignments. This course will definitely be the first step towards a rigorous study of the field.

Instructors


Daniil Polykovskiy

Researcher
HSE Faculty of Computer Science

Alexander Novikov

Researcher
HSE Faculty of Computer Science

About National Research University Higher School of Economics

National Research University - Higher School of Economics (HSE) is one of the top research universities in Russia. Established in 1992 to promote new research and teaching in economics and related disciplines, it now offers programs at all levels of university education across an extraordinary range of fields of study, including business, sociology, cultural studies, philosophy, political science, international relations, law, Asian studies, media and communications, mathematics, engineering, and more. Learn more at www.hse.ru.

About the Advanced Machine Learning Specialization

This specialization gives an introduction to deep learning, reinforcement learning, natural language understanding, computer vision, and Bayesian methods. Top Kaggle machine learning practitioners and CERN scientists will share their experience of solving real-world problems and help you to fill the gaps between theory and practice. Upon completion of the 7 courses, you will be able to apply modern machine learning methods in enterprise and understand the caveats of real-world data and settings.
Advanced Machine Learning

Frequently Asked Questions

  • When you enroll for a Certificate, you get access to all of the videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page; from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

  • The course requires a strong background in calculus, linear algebra, probability theory, and machine learning.

If you have more questions, visit the Learner Help Center.