
Learner Reviews & Feedback for Probabilistic Graphical Models 1: Representation by Stanford University

4.6
stars
1,424 ratings

About the Course

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems. This course is the first in a sequence of three. It describes the two basic PGM representations: Bayesian networks, which rely on a directed graph; and Markov networks, which use an undirected graph. The course discusses both the theoretical properties of these representations and their use in practice. The (highly recommended) honors track contains several hands-on assignments on how to represent some real-world problems. The course also presents some important extensions beyond the basic PGM representation, which allow more complex models to be encoded compactly.
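To make the "compact encoding" idea concrete, here is a minimal sketch (not from the course materials; the probability numbers are illustrative) of a three-variable Bayesian network, the classic rain/sprinkler/wet-grass example. Each variable's conditional probability table mentions only its parents in the directed graph, and the joint distribution is the product of those local tables:

```python
from itertools import product

# P(Rain)
p_r = {True: 0.2, False: 0.8}
# P(Sprinkler | Rain)
p_s = {(True,):  {True: 0.01, False: 0.99},
       (False,): {True: 0.40, False: 0.60}}
# P(Wet | Rain, Sprinkler)
p_w = {(True, True):   {True: 0.99, False: 0.01},
       (True, False):  {True: 0.80, False: 0.20},
       (False, True):  {True: 0.90, False: 0.10},
       (False, False): {True: 0.00, False: 1.00}}

def joint(r, s, w):
    # Bayesian-network factorization: each factor conditions
    # a variable only on its parents in the directed graph.
    return p_r[r] * p_s[(r,)][s] * p_w[(r, s)][w]

# Marginalize by summing out the other variables, e.g. P(Wet = True):
p_wet = sum(joint(r, s, True) for r, s in product([True, False], repeat=2))
```

With three binary variables, a full joint table would need 7 free parameters; the factored form here needs only 1 + 2 + 4 local entries, and the saving grows dramatically as the number of variables increases.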

Top reviews

RG

Jul 12, 2017

Prof. Koller did a great job communicating difficult material in an accessible manner. Thanks to her for starting Coursera and offering this advanced course so that we can all learn...Kudos!!

CM

Oct 22, 2017

The course was deep, and well-taught. This is not a spoon-feeding course like some others. The only downside were some "mechanical" problems (e.g. code submission didn't work for me).


201 - 225 of 311 Reviews for Probabilistic Graphical Models 1: Representation

By Nairouz M

Feb 13, 2017

Very helpful.

By brotherzhao

Feb 15, 2020

nice course!

By Utkarsh A

Dec 30, 2018

Had a lot of fun.

By Musalula S

Aug 2, 2018

Great course

By Yuri F

May 15, 2017

great course

By 赵紫川

Nov 27, 2016

Nice course.

By Pedro R

Nov 9, 2016

great course

By Frank

Dec 14, 2017

The instructor's teaching is way too freewheeling...

By HOLLY W

May 24, 2019

The course is excellent, with rich materials.

By Siyeong L

Jan 21, 2017

Awesome!!!

By Alireza N

Jan 12, 2017

Excellent!

By dingjingtao

Jan 7, 2017

excellent!

By Phan T B

Dec 2, 2016

very good!

By Jax

Jan 8, 2017

very nice

By Jose A A S

Nov 25, 2016

Wonderful

By Mohammed O E A

Oct 18, 2016

Fantastic

By zhou

Oct 13, 2016

very good

By 张浩悦

Nov 22, 2018

funny!!

By Alexander A S G

Feb 9, 2017

Thanks

By oilover

Dec 2, 2016

The instructor is great!!

By 刘仕琪

Oct 30, 2016

A good course.

By Accenture X

Oct 12, 2016

Great

By Ludovic P

Oct 29, 2017

I wish I could give this course four and a half stars.

On the positive side: there is a lot of value in this course. Professor Koller succeeds in introducing us to PGM representations in a few weeks. IMHO, one should really do all the exercises for the honors track. Without them, this course lacks hands-on sessions and is much less interesting. Most programming exercises are great, and the companion quizzes are really a plus.

When I followed Professor Ng's programming exercises, I was both delighted and frustrated: delighted because I learned a lot of things, frustrated because they were sometimes really too easy.

This is not the case for most exercises here. I find them so well prepared, so carefully crafted, that I often learned a lot from my first wrong submissions of the quizzes and programming exercises.

On the negative side: the sound quality of the recordings is sometimes not really good, especially in the first videos. That should not stop you from following this great course! Some programming exercises were a bit frustrating because their difficulty lies more in knowing Octave tips and tricks than in PGM. In addition, and this is more embarrassing, some exercises do not work, like Markov Networks for OCR (https://www.coursera.org/learn/probabilistic-graphical-models/programming/dZmtj/markov-networks-for-ocr); like other students, I had to disable some features and blindly submit my answers.

Also, some exercises were difficult for me because of their very precise English wording. Native speakers may handle that just fine, but as this course seems to have an international audience, simpler phrasing would be great.

I feel that raising this great course from 4 stars to 5 stars would not require much effort: prepare better recordings of the few videos that have really bad sound, correct those small bugs in the exercises, and simplify some English wording.

I nonetheless recommend this course to anyone interested in this field, and I intend to take the next course, on inference.

By Jonathan H

Jun 25, 2017

Excellent course. The video lectures are challenging (I had to keep my finger on the pause key) even if you're familiar with the math, since the instructor encapsulates concepts in an amazingly concise manner. This pays off with a lot of "Aha!" moments as strong concepts are combined to create insights, especially starting around week 3. I'm already in love with this subject after part 1.

It would have been nice to have more worked homework problems, since this is a math course, but that is not necessary to pass the class or understand the concepts. I've purchased Prof. Koller's text on PGMs and hope to solidify some of the intuitions I'm missing shortly.

Taking off a star because the test cases and grading software for the honors homework assessments were clearly low effort and sometimes incorrect. There were a lot of cases where functions passed all the provided and automatic test cases despite major flaws (e.g. not working for any cases besides n=1), which made it difficult to tell if things worked since the programming style is unique. The homework itself was super interesting and valuable, but I probably spent over 50% of the time fighting the grader instead of learning stuff. Given that I'm a professional programmer and completed most of the homework in 25-50% of the estimated time, I'm guessing that the average student wasted even more time with issues that are ultimately unrelated to our understanding of PGM.

By Hunter J

Jan 12, 2017

Before I took this course I took the Stanford Machine Learning course, which I greatly enjoyed. That course allows for the learning of difficult concepts in a way that I found less painful than working through a textbook. In this course there is a lot less video content, and the coding assignments are less interesting. Expect to spend a lot of time understanding the nuances of the code that the instructional team has developed, and be prepared to really pore over the gritty aspects of Octave or MATLAB. If you're serious about this course I suggest buying the accompanying book. The slides are not easy to understand without the audio narration, which makes them difficult to review, and unlike the case in the ML course, there are not a lot of readily available open introductions written on the topics.