
Learner reviews and feedback for Machine Learning: Classification by the University of Washington

4.7
2,820 ratings
470 reviews

About the Course

Case Studies: Analyzing Sentiment & Loan Default Prediction

In our case study on analyzing sentiment, you will create models that predict a class (positive/negative sentiment) from input features (text of the reviews, user profile information, ...). In our second case study for this course, loan default prediction, you will tackle financial data and predict when a loan is likely to be risky or safe for the bank. These tasks are examples of classification, one of the most widely used areas of machine learning, with a broad array of applications including ad targeting, spam detection, medical diagnosis, and image classification.

In this course, you will create classifiers that provide state-of-the-art performance on a variety of tasks. You will become familiar with the most successful techniques, which are most widely used in practice, including logistic regression, decision trees, and boosting. In addition, you will be able to design and implement the underlying algorithms that can learn these models at scale, using stochastic gradient ascent. You will implement these techniques on real-world, large-scale machine learning tasks. You will also address significant challenges you will face in real-world applications of ML, including handling missing data and measuring precision and recall to evaluate a classifier. This course is hands-on, action-packed, and full of visualizations and illustrations of how these techniques will behave on real data. We've also included optional content in every module, covering advanced topics for those who want to go even deeper!

Learning Objectives: By the end of this course, you will be able to:

- Describe the input and output of a classification model.
- Tackle both binary and multiclass classification problems.
- Implement a logistic regression model for large-scale classification.
- Create a non-linear model using decision trees.
- Improve the performance of any model using boosting.
- Scale your methods with stochastic gradient ascent.
- Describe the underlying decision boundaries.
- Build a classification model to predict sentiment in a product review dataset.
- Analyze financial data to predict loan defaults.
- Use techniques for handling missing data.
- Evaluate your models using precision-recall metrics.
- Implement these techniques in Python (or in the language of your choice, though Python is highly recommended).
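To make the sentiment case study concrete, here is a minimal sketch of the kind of classifier the description refers to: a logistic regression model that predicts positive/negative sentiment from review text and is evaluated with precision and recall. This is not taken from the course materials (the course has you implement the algorithms yourself, e.g. via stochastic gradient ascent); it uses scikit-learn for brevity, and the reviews and labels are made-up toy data.

```python
# Illustrative sketch only, not course code: a tiny sentiment classifier
# using scikit-learn's LogisticRegression on made-up toy reviews.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

reviews = [
    "Great product, works perfectly",
    "Terrible quality, broke after one day",
    "Absolutely love it, highly recommend",
    "Waste of money, very disappointed",
]
labels = [1, 0, 1, 0]  # 1 = positive sentiment, 0 = negative sentiment

# Turn raw review text into bag-of-words feature vectors.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(reviews)

# Fit a logistic regression classifier on the toy data.
model = LogisticRegression()
model.fit(X, labels)

# Evaluate with precision and recall (here on the training set, for brevity;
# in practice you would hold out a validation or test set).
predictions = model.predict(X)
print("precision:", precision_score(labels, predictions))
print("recall:", recall_score(labels, predictions))
```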

Popular Reviews

SS

Oct 16, 2016

Hats off to the team who put the course together! Prof Guestrin is a great teacher. The course gave me in-depth knowledge regarding classification and the math and intuition behind it. It was fun!

CJ

Jan 25, 2017

Very impressive course. I would recommend taking courses 1 and 2 in this specialization first, since this course skips over some things that were explained thoroughly in those courses.


Machine Learning: Classification: 101 - 125 of 440 reviews

by Marcus V M d S

Oct 16, 2017

Another great course from this specialization. Tremendous effort in making the notebooks and assignments. I just think there could also be recommended readings.

by Sauvage F

Mar 29, 2016

Excellent course; I'm very fond of Carlos' jokes mixed in with the hard concepts ^^. Lectures are precise, concise, and comprehensive. I really enjoyed diving into the depths of the algorithms' mechanics (like Emily did in the Regression course). I also deeply appreciated the real-world examples in the lectures and the real-world datasets in the assignments.

Some may regret the absence of a few "classic" algorithms like SVM, but Carlos definitely made his point about that in the forum and did not rule out adding an optional module on it.

I found some of the assignments less challenging than those in the Regression course, but maybe I'm just getting better at machine learning and Python ^^.

Thanks again to Emily and Carlos for their brilliant work on this very promising specialization.

by Dmitri T

Apr 25, 2016

Really liked the practical application of this course - very useful in learning classification methods.

by Saransh A

Oct 31, 2016

Well, this series just doesn't cease to amaze me! Another great course after the introductory and regression courses. Though I really missed detailed explanations of Random Forests and other ensemble methods. Also, SVM was not discussed, but there were many other topics covered that all other courses and books easily skip. The programming assignments were fine, more focused on teaching the algorithms than trapping someone in the coding part. This is the series for someone who really wants to get hold of what machine learning really is. One thing I really like about this course is that there are optional videos from time to time, where they discuss the mathematical aspects of the algorithms they teach, which really quenches my thirst for mathematical rigour. Definitely continuing this specialisation forward.

by Rajat S B

Jun 13, 2016

Great course. It gives an idea of how to do classification from scratch, as well as an understanding of how to handle large datasets during training. Boosting is, from what I have heard, one of the most important techniques in machine learning, and it's great to understand the concept behind it.

by Ahmed N A

May 04, 2018

The best course I could find to get a strong grasp of the basics of machine learning. Presented in very easy-to-follow steps, with thorough coverage of all the concepts necessary to understand the big picture of each algorithm.

by 海上机械师

Sep 13, 2016

So cool and very practical.

by Venkata D

Apr 14, 2016

Great course and learning

by Marcio R

Jun 14, 2016

Excellent course, from the material to the practical activities and the lectures. The discussion forum is full of people interested in helping. This is the definitive distance-learning specialization in Machine Learning.

by TONGHONG C

Jun 14, 2017

Best ML course I've ever taken!

by Jason M C

Mar 29, 2016

This continues UWash's outstanding Machine Learning series of classes, and is just as impressive as, if not more so than, the Regression class it follows. I'm super excited for the next class!

by Rahul G

May 06, 2017

Excellent course, except that the week 7 assignment based on the ipynb notebook had some redundant questions. Otherwise a good course that especially sheds light on AdaBoost, ensemble classifiers, and stochastic gradient with batch processing.

Thanks Professor Carlos.

by Patrick M

Aug 08, 2016

Excellent course. Great mix of theory overview coupled with practical examples to work through.

by Farrukh N A

Feb 10, 2017

I found Carlos to be the best instructor in the machine learning domain; he presented the algorithms and all the core machine learning concepts in a really great way.

by Etienne V

Nov 13, 2016

Great course with very good material! I'd like to see assignments that leave more coding tasks to the student.

by Paulo R M B

Jan 31, 2017

Well explained!

by Evaldas B

Dec 14, 2017

Very nice course with a bit of detail about how classification is done. Enjoyed it.

by 李真

Mar 06, 2016

great

by Aditi R

Oct 20, 2016

Wonderful experience. Prof is very good.

by Colin B

Apr 09, 2017

Really interesting course, as usual.

by Daniel Z

Mar 08, 2016

This is a hands-on, very exciting course, strongly recommended for all audiences.

by andreas c c

Aug 16, 2017

The course is demanding, but I learned a lot about classification.

The teachers are awesome!

by eric g

Mar 21, 2016

This was the best part of the specialization for me; Classification is great.

by Youssef R

Aug 23, 2017

This is really a wonderful course, and I recommend it to anyone who wants to master some important techniques in the trending field of machine learning.

by Liang-Yao W

Aug 12, 2017

The course walks through (and works through) the concepts of linear classifiers, logistic regression, decision trees, boosting, etc. For me it is a good introduction to these fundamental ideas, with depth but not so deep as to become distracting.

I personally became interested in knowing a bit more about the theoretical basis of tools and concepts like boosting or maximum likelihood. The course understandably doesn't go that deep into math and theory, which leaves me a bit unsatisfied :P. But that is probably too much to ask of a short course, and I do think the course covers great material already.