A very deep and comprehensive course covering some of the core fundamentals of Machine Learning. It can get a bit frustrating at times because of the numerous assignments :P but a fun thing overall :)
Hats off to the team who put the course together! Prof Guestrin is a great teacher. The course gave me in-depth knowledge regarding classification and the math and intuition behind it. It was fun!
by Ole H S•
First, I like these courses a lot. They come pretty close to covering just what you need to actually do machine learning in the real world, without diving too deep into topics that have no practical value.
This course was a bit too thin: the last four weeks contained little in-depth information and seemed to brush over a lot of different topics that could have been covered more thoroughly. Although they were important topics, the course could have gone deeper on at least three or four of them. The last three weeks could have been a course of their own if properly explored. However, the concepts are covered well enough to be usable in practice, I believe.
The programming exercises were ridiculously simple. Everything was reduced to filling in one or two lines in a bigger function. I understand that the point was to see how these functions are built, and that this deepens our understanding of the algorithms already available in packages like scikit-learn and GraphLab. Also, the content became a bit too repetitive (this actually started in the second course and continues in this one). The time spent on variations of the same topic across different models made it challenging to pay attention when the lecture finally reached a new point (my brain fell asleep while waiting for something new).
by Ryan M•
While I feel I have a good theoretical understanding of the issues involved in classification, including how the algorithms work and how to implement them, this course could have prepared me better to attack an actual problem by following a real case study through, showing the steps someone experienced with real-world problems would take to come up with a good classifier.
In particular, while a number of classifiers were presented, there was little to no discussion of the relative advantages and disadvantages of each algorithm. In what cases should I choose logistic regression? A decision tree or a boosted decision tree?
Finally, it seems that random forests and support vector machines are common classifiers, and this course did not cover them. I instead had to learn about random forests (a relatively simple concept that could have been included with the boosted decision tree content) from scikit-learn's web site.
by Ziyue Z•
Compared with the regression course, this course was a slight disappointment. 1. There is less material than in the regression course; maybe this is because classification concepts are more intuitive. 2. The slides are much less polished. Some lessons even re-use earlier lesson slides at the beginning as a "review", much like soap operas re-use scenes from earlier episodes as "memory recall" to fill air time. 3. The math is more hand-wavy than in the regression course. Neither course is supposed to go in depth with proofs, but I felt the regression course was at the right level, while this course degraded too far. Do note it's very possible that I'm biased, because I had already seen more of this course's material than the regression course's.
by Sunil N•
The distribution of the workload is a bit skewed: weeks 6 and 7 were extremely light (barely an hour of work), while weeks 2 and 5 were too heavy for a week. Syntax errors in the assignment notebooks kept the nerves active, but can be a bit frustrating for relatively naive or trusting candidates, who might end up spending a lot of time hunting for bugs in their own code. Overall a nice experience. The Covid and WFH situation is not allowing proper time for learning, but the reminders helped in meeting the goal. Thank you
Turi stopped working on SFrame (at least on GitHub), and SFrame does not support Python 3. Expect some difficulty if you use other tools like pandas - the programming assignments completely assume you use SFrame. Fortunately, the data is provided in CSV format, so you can complete them anyway, but again, don't expect a smooth ride.
Also, the lectures tend to cover general concepts rather than mathematical details. I don't like that, but it could be a plus for beginners.
by Tom L•
Well, after the regression course, which I actually found interesting, the classification course doesn't look so good. The programming assignments are mostly pointless. The use of graphlab doesn't make it better. The info presented in this course is rather superficial. If you're entirely new to machine learning, you could find some value in this course. If not, go buy a good book.
by Oliverio J S J•
At first the course seems interesting but, as it progresses, it fails to convey why these contents are important in the deep learning era. In addition, it seems quite obvious that some contents are missing; I suppose that they have been eliminated due to the same problems that forced the cancellation of the last specialization courses.
The material is good, but the choice of GraphLab Create is a poor one. It's not used in industry and it's poorly supported. I had issues installing it both via the command line and via the installer, so I ended up using the AWS machine. But that has its own drawbacks, such as the slowness and the setup time.
by Nitzan O•
The course is interesting and well taught. The professor is very enthusiastic, which makes the course fun to watch. The problem, in my opinion, is that the content is too superficial: it completely lacks mathematical background, and the programming exercises are sometimes no more than copy-paste.
by ANIMESH M•
The course is up to the mark, but what I felt was missing is the coding: they didn't focus on implementation tasks, and simply gave out the notebooks for the assignments.
Also, SVM and random forest classifiers are missing.
Summing up the whole experience, I would give it a 6.5 out of 10.
by Kumar B•
This course covers the basics of classification very well, but I would have liked optional sections on more advanced topics. Some of the quiz questions were a bit confusing. It would have been good if the exercises also dealt with unbalanced data sets in more detail.
by Neelkanth S M•
The content is good, but completing the assignments is a real pain because they chose to deploy an unstable proprietary Python library, which is hard to install and run (as of Q1 2019). The entire learning experience is marred by this GraphLab Python library.
by D B•
Pros: Absolutely fantastic theory explanations. Establishes solid fundamentals. Cons: The bugs in the tests/notebooks have not been fixed, which demands searching the discussion forum every time. Would highly recommend for starters!
by ANGELICA D C•
It ends up being very confusing. The material from the optional videos was not followed up on; it was only used at the very end in the assignments, by which point it was out of context and hard to understand.
by Supharerk T•
All of the course's lectures are great until it reaches week 5, where it gets really hard to follow; the programming assignment doesn't give enough hints, and the lectures on this topic don't help much.
by nazar p•
While courses 1 and 2 of this specialization were quite good, I find this one a bit sparse on content. I think this course could be easily compressed into 2-3 weeks instead of 7.
by Rohit J•
A lot of the interesting parts of the course are only available as optional material, and a lot of the difficult parts of the coding exercises are given to you - the challenge is not there. :/
by Ilan S•
The videos were pretty good, but a bit too slow and easy. The assignments were OK, but too guided. Also, there was too much reimplementation of algorithms.