About this Course
54,706 recent views

Course 3 of 4 in the Machine Learning Specialization

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Approx. 43 hours to complete

Suggested: 7 weeks of study, 5-8 hours/week...

English

Subtitles: English, Korean, Arabic

Skills you will gain

Logistic Regression, Statistical Classification, Classification Algorithms, Decision Tree

Syllabus - What you will learn from this course

Week 1
1 hour to complete

Welcome!

Classification is one of the most widely used techniques in machine learning, with a broad array of applications, including sentiment analysis, ad targeting, spam detection, risk assessment, medical diagnosis and image classification. The core goal of classification is to predict a category or class y from some inputs x. Through this course, you will become familiar with the fundamental models and algorithms used in classification, as well as a number of core machine learning concepts. Rather than covering all aspects of classification, you will focus on a few core techniques, which are widely used in the real world to get state-of-the-art performance. By following our hands-on approach, you will implement your own algorithms on multiple real-world tasks, and deeply grasp the core techniques needed to be successful with these approaches in practice. This introduction to the course provides you with an overview of the topics we will cover and the background knowledge and resources we assume you have....
8 videos (Total 27 min), 3 readings
8 videos
What is this course about? (6 min)
Impact of classification (1 min)
Course overview (3 min)
Outline of first half of course (5 min)
Outline of second half of course (5 min)
Assumed background (3 min)
Let's get started! (45 sec)
3 readings
Important Update regarding the Machine Learning Specialization (10 min)
Slides presented in this module (10 min)
Reading: Software tools you'll need (10 min)
2 hours to complete

Linear Classifiers & Logistic Regression

Linear classifiers are amongst the most practical classification methods. For example, in our sentiment analysis case-study, a linear classifier associates a coefficient with the counts of each word in the sentence. In this module, you will become proficient in this type of representation. You will focus on a particularly useful type of linear classifier called logistic regression, which, in addition to allowing you to predict a class, provides a probability associated with the prediction. These probabilities are extremely useful, since they provide a degree of confidence in the predictions. In this module, you will also be able to construct features from categorical inputs, and to tackle classification problems with more than two classes (multiclass problems). You will examine the results of these techniques on a real-world product sentiment analysis task....
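To make these ideas concrete, here is a minimal sketch (not the course's own notebook code) of how a linear classifier scores a sentence and how the sigmoid link turns that score into a class probability; the word-count features and coefficient values below are made up purely for illustration:

```python
import numpy as np

def sigmoid(score):
    # Logistic (sigmoid) link: maps a real-valued score to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-score))

def predict_probability(features, coefficients, intercept):
    # P(y = +1 | x): linear score passed through the sigmoid link.
    score = np.dot(features, coefficients) + intercept
    return sigmoid(score)

# Hypothetical word-count features for one review: [count("awesome"), count("awful")]
features = np.array([2.0, 0.0])
coefficients = np.array([1.5, -2.0])   # made-up coefficient values
intercept = 0.0

prob_positive = predict_probability(features, coefficients, intercept)
predicted_class = +1 if prob_positive > 0.5 else -1
print(prob_positive, predicted_class)   # roughly 0.95, +1
```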
18 videos (Total 78 min), 2 readings, 2 quizzes
18 videos
Intuition behind linear classifiers (3 min)
Decision boundaries (3 min)
Linear classifier model (5 min)
Effect of coefficient values on decision boundary (2 min)
Using features of the inputs (2 min)
Predicting class probabilities (1 min)
Review of basics of probabilities (6 min)
Review of basics of conditional probabilities (8 min)
Using probabilities in classification (2 min)
Predicting class probabilities with (generalized) linear models (5 min)
The sigmoid (or logistic) link function (4 min)
Logistic regression model (5 min)
Effect of coefficient values on predicted probabilities (7 min)
Overview of learning logistic regression models (2 min)
Encoding categorical inputs (4 min)
Multiclass classification with 1 versus all (7 min)
Recap of logistic regression classifier (1 min)
2 readings
Slides presented in this module (10 min)
Predicting sentiment from product reviews (10 min)
2 practice exercises
Linear Classifiers & Logistic Regression (10 min)
Predicting sentiment from product reviews (24 min)
Week 2
2 hours to complete

Learning Linear Classifiers

Once familiar with linear classifiers and logistic regression, you can now dive in and write your first learning algorithm for classification. In particular, you will use gradient ascent to learn the coefficients of your classifier from data. You will first need to define the quality metric for these tasks using an approach called maximum likelihood estimation (MLE). You will also become familiar with a simple technique for selecting the step size for gradient ascent. An optional, advanced part of this module will cover the derivation of the gradient for logistic regression. You will implement your own learning algorithm for logistic regression from scratch, and use it to learn a sentiment analysis classifier....
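As a rough sketch of the kind of learning algorithm this module builds (assuming labels coded as +1/-1, a constant step size, and a feature matrix that already includes a column of ones for the intercept; this is illustrative, not the assignment's starter code), gradient ascent repeatedly nudges the coefficients in the direction of the log-likelihood gradient:

```python
import numpy as np

def sigmoid(scores):
    return 1.0 / (1.0 + np.exp(-scores))

def logistic_regression_gradient_ascent(feature_matrix, labels, step_size, max_iter):
    """Learn coefficients by maximizing the data log-likelihood with gradient ascent.

    feature_matrix: (N, D) array, one row per data point (include a column of ones
                    for the intercept).
    labels: length-N array of +1 / -1 class labels.
    """
    coefficients = np.zeros(feature_matrix.shape[1])
    indicator = (labels == +1).astype(float)                      # 1[y_i = +1]
    for _ in range(max_iter):
        predictions = sigmoid(feature_matrix.dot(coefficients))   # P(y = +1 | x_i, w)
        errors = indicator - predictions
        gradient = feature_matrix.T.dot(errors)                   # d(log-likelihood)/dw
        coefficients += step_size * gradient                      # ascend: we are maximizing
    return coefficients
```

With a suitably small step size and enough iterations, the learned coefficients can then be used with a probability-prediction function like the one sketched earlier.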
18 videos (Total 83 min), 2 readings, 2 quizzes
18 videos
Intuition behind maximum likelihood estimation (4 min)
Data likelihood (8 min)
Finding best linear classifier with gradient ascent (3 min)
Review of gradient ascent (6 min)
Learning algorithm for logistic regression (3 min)
Example of computing derivative for logistic regression (5 min)
Interpreting derivative for logistic regression (5 min)
Summary of gradient ascent for logistic regression (2 min)
Choosing step size (5 min)
Careful with step sizes that are too large (4 min)
Rule of thumb for choosing step size (3 min)
(VERY OPTIONAL) Deriving gradient of logistic regression: Log trick (4 min)
(VERY OPTIONAL) Expressing the log-likelihood (3 min)
(VERY OPTIONAL) Deriving probability y=-1 given x (2 min)
(VERY OPTIONAL) Rewriting the log likelihood into a simpler form (8 min)
(VERY OPTIONAL) Deriving gradient of log likelihood (8 min)
Recap of learning logistic regression classifiers (1 min)
2 readings
Slides presented in this module (10 min)
Implementing logistic regression from scratch (10 min)
2 practice exercises
Learning Linear Classifiers (12 min)
Implementing logistic regression from scratch (16 min)
2 hours to complete

Overfitting & Regularization in Logistic Regression

As we saw in the regression course, overfitting is perhaps the most significant challenge you will face as you apply machine learning approaches in practice. This challenge can be particularly significant for logistic regression, as you will discover in this module, since you not only risk getting an overly complex decision boundary, but your classifier can also become overly confident about the probabilities it predicts. In this module, you will investigate overfitting in classification in significant detail, and obtain broad practical insights from some interesting visualizations of the classifiers' outputs. You will then add a regularization term to your optimization to mitigate overfitting. You will investigate both L2 regularization to penalize large coefficient values, and L1 regularization to obtain additional sparsity in the coefficients. Finally, you will modify your gradient ascent algorithm to learn regularized logistic regression classifiers. You will implement your own regularized logistic regression classifier from scratch, and investigate the impact of the L2 penalty on real-world sentiment analysis data....
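As a hedged sketch of how the gradient ascent update changes once an L2 penalty is added: the only difference from the unregularized version is an extra -2*lambda*w_j term in the gradient. Leaving the intercept unpenalized, and the variable names below, are illustrative assumptions rather than the course's exact code:

```python
import numpy as np

def sigmoid(scores):
    return 1.0 / (1.0 + np.exp(-scores))

def l2_regularized_gradient_ascent(feature_matrix, labels, step_size, l2_penalty, max_iter):
    """Gradient ascent on the L2-penalized log-likelihood.

    Assumes column 0 of feature_matrix is the constant (intercept) feature,
    which is left unpenalized here.
    """
    coefficients = np.zeros(feature_matrix.shape[1])
    indicator = (labels == +1).astype(float)
    for _ in range(max_iter):
        predictions = sigmoid(feature_matrix.dot(coefficients))
        errors = indicator - predictions
        gradient = feature_matrix.T.dot(errors)
        gradient[1:] -= 2.0 * l2_penalty * coefficients[1:]   # derivative of -lambda * ||w||^2
        coefficients += step_size * gradient
    return coefficients
```

Larger values of l2_penalty shrink the learned coefficients toward zero, which is exactly the effect explored in the module's visualizations.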
13 videos (Total 66 min), 2 readings, 2 quizzes
13 videos
Review of overfitting in regression (3 min)
Overfitting in classification (5 min)
Visualizing overfitting with high-degree polynomial features (3 min)
Overfitting in classifiers leads to overconfident predictions (5 min)
Visualizing overconfident predictions (4 min)
(OPTIONAL) Another perspective on overfitting in logistic regression (8 min)
Penalizing large coefficients to mitigate overfitting (5 min)
L2 regularized logistic regression (4 min)
Visualizing effect of L2 regularization in logistic regression (5 min)
Learning L2 regularized logistic regression with gradient ascent (7 min)
Sparse logistic regression with L1 regularization (7 min)
Recap of overfitting & regularization in logistic regression (58 sec)
2 readings
Slides presented in this module (10 min)
Logistic Regression with L2 regularization (10 min)
2 practice exercises
Overfitting & Regularization in Logistic Regression (16 min)
Logistic Regression with L2 regularization (16 min)
Week 3
2 hours to complete

Decision Trees

Along with linear classifiers, decision trees are amongst the most widely used classification techniques in the real world. This method is extremely intuitive, simple to implement and provides interpretable predictions. In this module, you will become familiar with the core decision tree representation. You will then design a simple, recursive greedy algorithm to learn decision trees from data. Finally, you will extend this approach to deal with continuous inputs, a fundamental requirement for practical problems. In this module, you will investigate a brand new case-study in the financial sector: predicting the risk associated with a bank loan. You will implement your own decision tree learning algorithm on real loan data....
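To illustrate one greedy step of the tree-learning idea described above, the sketch below picks the binary feature whose split yields the lowest classification error; the data layout (a list of dictionaries) and the function names are assumptions made for illustration, not the assignment's API:

```python
def node_mistakes(labels):
    # Number of mistakes made by predicting the majority class at this node.
    if not labels:
        return 0
    num_positive = sum(1 for y in labels if y == +1)
    return min(num_positive, len(labels) - num_positive)

def best_splitting_feature(data, features, target):
    """Greedy step: pick the binary (0/1) feature whose split has the lowest classification error.

    data: list of dicts mapping feature names (and the target label, +1/-1) to values.
    """
    best_feature, best_error = None, float('inf')
    for feature in features:
        left_labels = [row[target] for row in data if row[feature] == 0]
        right_labels = [row[target] for row in data if row[feature] == 1]
        error = (node_mistakes(left_labels) + node_mistakes(right_labels)) / float(len(data))
        if error < best_error:
            best_feature, best_error = feature, error
    return best_feature
```

The full recursive algorithm would call a step like this at each node, split the data on the chosen feature, and recurse on each side until a stopping condition is met.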
13 videos (Total 47 min), 3 readings, 3 quizzes
13 videos
Intuition behind decision trees (1 min)
Task of learning decision trees from data (3 min)
Recursive greedy algorithm (4 min)
Learning a decision stump (3 min)
Selecting best feature to split on (6 min)
When to stop recursing (4 min)
Making predictions with decision trees (1 min)
Multiclass classification with decision trees (2 min)
Threshold splits for continuous inputs (6 min)
(OPTIONAL) Picking the best threshold to split on (3 min)
Visualizing decision boundaries (5 min)
Recap of decision trees (56 sec)
3 readings
Slides presented in this module (10 min)
Identifying safe loans with decision trees (10 min)
Implementing binary decision trees (10 min)
3 practice exercises
Decision Trees (22 min)
Identifying safe loans with decision trees (14 min)
Implementing binary decision trees (14 min)
Week 4
2 hours to complete

Preventing Overfitting in Decision Trees

Out of all machine learning techniques, decision trees are amongst the most prone to overfitting. No practical implementation is possible without including approaches that mitigate this challenge. In this module, through various visualizations and explorations, you will investigate why decision trees suffer from significant overfitting problems. Using the principle of Occam's razor, you will mitigate overfitting by learning simpler trees. First, you will design algorithms that stop the learning process before the decision trees become overly complex. In an optional segment, you will design a very practical approach that learns an overly-complex tree, and then simplifies it with pruning. Your implementation will investigate the effect of these techniques on mitigating overfitting on our real-world loan data set. ...
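As an illustration of the early-stopping idea (the thresholds and function signature below are made-up defaults, not the course's exact conditions), a tree learner can check a few conditions before growing a subtree and return a leaf instead when any of them triggers:

```python
def should_stop_early(labels, remaining_features, depth,
                      max_depth=10, min_node_size=10, min_error_reduction=0.0,
                      error_before_split=None, error_after_split=None):
    """Illustrative checks run before growing a subtree; return True to create a leaf instead.

    The thresholds (max_depth, min_node_size, min_error_reduction) are made-up defaults.
    """
    if len(set(labels)) <= 1:            # all examples at this node agree: nothing to split
        return True
    if not remaining_features:           # no features left to split on
        return True
    if depth >= max_depth:               # early stopping: tree is deep enough
        return True
    if len(labels) <= min_node_size:     # early stopping: too few data points at this node
        return True
    if (error_before_split is not None and error_after_split is not None
            and error_before_split - error_after_split < min_error_reduction):
        return True                      # early stopping: the best split barely reduces error
    return False
```

Pruning, covered in the optional videos, takes the opposite route: grow a large tree first, then remove subtrees whose complexity is not justified by the error reduction they provide.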
8 videos (Total 40 min), 2 readings, 2 quizzes
8 videos
Overfitting in decision trees (5 min)
Principle of Occam's razor: Learning simpler decision trees (5 min)
Early stopping in learning decision trees (6 min)
(OPTIONAL) Motivating pruning (8 min)
(OPTIONAL) Pruning decision trees to avoid overfitting (6 min)
(OPTIONAL) Tree pruning algorithm (3 min)
Recap of overfitting and regularization in decision trees (1 min)
2 readings
Slides presented in this module (10 min)
Decision Trees in Practice (10 min)
2 practice exercises
Preventing Overfitting in Decision Trees (22 min)
Decision Trees in Practice (28 min)
1 hour to complete

Handling Missing Data

Real-world machine learning problems are fraught with missing data. That is, very often, some of the inputs are not observed for all data points. This challenge is very significant, happens in most cases, and needs to be addressed carefully to obtain great performance. Yet this issue is rarely discussed in machine learning courses. In this module, you will tackle the missing data challenge head on. You will start with the two most basic techniques to convert a dataset with missing data into a clean dataset, namely skipping missing values and imputing missing values. In an advanced section, you will also design a modification of the decision tree learning algorithm that builds decisions about missing data right into the model. You will also explore these techniques in your real-data implementation. ...
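A minimal sketch of the two basic strategies, assuming the data lives in a pandas DataFrame (the column names and values below are hypothetical, and mean imputation is just one common choice; the course also covers building missing-value handling into the tree itself):

```python
import numpy as np
import pandas as pd

# Hypothetical loan data with missing entries (NaN)
loans = pd.DataFrame({
    'income':     [50000, np.nan, 72000, 61000],
    'term_years': [3, 5, np.nan, 3],
    'safe_loan':  [+1, -1, +1, -1],
})

# Strategy 1: skip (drop) data points that contain any missing value
loans_skipped = loans.dropna()

# Strategy 2: impute missing values, e.g. with each column's mean
# (the median or most frequent value are other common choices)
loans_imputed = loans.fillna(loans.mean())
```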
6 videos (Total 25 min), 1 reading, 1 quiz
6 videos
Strategy 1: Purification by skipping missing data (4 min)
Strategy 2: Purification by imputing missing data (4 min)
Modifying decision trees to handle missing data (4 min)
Feature split selection with missing data (5 min)
Recap of handling missing data (1 min)
1 reading
Slides presented in this module (10 min)
1 practice exercise
Handling Missing Data (14 min)
4.7
451 reviews

48%

started a new career after completing these courses

46%

got a tangible career benefit from this course

13%

got a pay increase or promotion

Top reviews

by SS, Oct 16th 2016

Hats off to the team who put the course together! Prof Guestrin is a great teacher. The course gave me in-depth knowledge regarding classification and the math and intuition behind it. It was fun!

by CJ, Jan 25th 2017

Very impressive course, I would recommend taking course 1 and 2 in this specialization first since they skip over some things in this course that they have explained thoroughly in those courses

Instructors

Carlos Guestrin

Amazon Professor of Machine Learning
Computer Science and Engineering

Emily Fox

Amazon Professor of Machine Learning
Statistics

About the University of Washington

Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the preeminent research universities in the world....

About the Machine Learning Specialization

This Specialization from leading researchers at the University of Washington introduces you to the exciting, high-demand field of Machine Learning. Through a series of practical case studies, you will gain applied experience in major areas of Machine Learning including Prediction, Classification, Clustering, and Information Retrieval. You will learn to analyze large and complex datasets, create systems that adapt and improve over time, and build intelligent applications that can make predictions from data....
Machine Learning

Frequently Asked Questions

  • When you enroll for a Certificate, you get access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the course. Your electronic Certificate will be added to your Accomplishments page, from which you can print it or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

More questions? Visit the Learner Help Center.