I really enjoyed this course. It would be awesome to see at least one training example using a GPU (maybe in Google Colab, since not everyone owns one) so we could train the deepest networks from scratch.
Great Course Overall

One thing is that some videos are not edited properly, so Andrew repeats the same thing again and again. Other than that, a great and simple explanation of such complicated tasks.
by Volodymyr M•
This is not an education in any way. Yes, Convolutional Neural Networks provides a good overview of convolutional networks and the technology behind them. I like the way Andrew Ng structured the material and his way of explaining details. Unfortunately, as is a common problem across the whole "Deep Learning Specialization", the theoretical material only scratches the surface. There is nothing deep in terms of theory. You will have to spend quite a lot of time digging for information yourself if you plan to use the course material for any practical task or assignment. To get the missing pieces, I had to go through the whole Spring 2017 CS231n. That is fine if you have enough time to watch two sets of videos, but I expected to get the same quality of material here, on Coursera.
Another issue is the quizzes. I am puzzled about what these quizzes are testing. Questions often have more than one tentatively correct answer, and probability works against you: you may happen to select the correct answers for some questions, but definitely not all of them. At the same time, it is quite easy to derive the correct answers on a second try.
The programming assignments are a complete disaster. While I kind of liked the assignments from weeks 1 and 2, I felt like I wasted my time working on the assignments from weeks 3 and 4. I expected the programming assignments to guide me through training complex networks and give some practical insight that I could use for real-life tasks, but that was not there.
There is a good introduction to TensorFlow, while Keras is not even touched, yet many assignments in weeks 3 and 4 use Keras. It is necessary to pick up theory and practice regarding Keras elsewhere. And once you have learned enough about Keras elsewhere, guess what: the programming assignment becomes useless as education, because it is too trivial.
I really wanted to rate this course two stars, but the video materials and programming assignments from weeks 1 and 2 slightly improved my attitude.
by Yair S•
While Prof. Ng's online teaching is excellent, as in the other courses, this course specifically has several pitfalls which cannot be ignored:
1) The teaching and coverage of TensorFlow are by far insufficient. If this subject is seen as an essential part of the course, it must be taught systematically, but unfortunately this is not the case. More often than not, you find yourself doing guesswork in the assignments when it comes to TF code, which is also reflected in the Discussion Forum. So to summarize, TF must be covered in a systematic way, either in this course or a previous one.
2) There is a bug in the given code of the week 4 NST assignment. It should be fixed.
3) There are several written corrections to errors in the online videos. These videos can and should be re-recorded.
4) Last but certainly not least: I have experienced frequent and really disturbing connection problems with the Python Notebook, with frequent connection errors that cannot be recovered from, after which one must reopen the Notebook. While this was, to some extent, the case in other courses, in this course it was much more of a problem, especially in Week 4, probably due to the large amount of data, where each rerun requires another 20-30 minutes. A MUST fix.
by Kaitlin P•
This course let me down a bit. Like the other three in this sequence, the content was great. The lectures were informative, and I appreciate the detail that Andrew Ng goes into while talking about propagation. The pictures he draws are always instructive as well. It is not often that you find instructors who are both experts in their field and know how to convey their knowledge to a broader audience.
Unfortunately, the production quality is not up to the standard of the previous courses. In the previous three courses, a sentence would very occasionally get repeated; here it happened, or seemed to happen, dozens of times. This can be very grating when listening to hours of lectures. Additionally, the homework grading system had a bug which left lots of people frustrated when trying to submit their work. While accidents happen, the response of one moderator ("search this keyword") was not appreciated. I would certainly never tell my students to google something when I had made a mistake in the assignment. It was unhelpful, inappropriate given that the mistake was on the creator's part, and borderline unprofessional. Then again, maybe the moderator was just English. I will finish the series, but I sincerely hope the production quality is back to normal in the final RNN course.
by ehren e•
The material is getting a little stagnant. A couple of the assignments had code-writing portions that used TensorFlow 1.0. That version has been deprecated in favor of 2.0, and the documentation links for the 1.0 examples and tips in the programming exercises appear to have been archived. It would be excellent if the course were refreshed with more current frameworks. I know this can be challenging, but I think 1.0 was deprecated quite some time ago. Also, it would be very helpful if you offered a course on working with pretrained models (e.g. FaceNet, MXNet (ResNet) models, COCO models, etc.), as this is a huge benefit in being able to get moving quickly. Andrew points this out in his lectures and spends some time on it. Even some pointers/guides to getting rolling quickly would help, even if it amounts to RTFD in some cases. The learning in this course has been excellent, but I now see a gap in getting to the next stage: being able to pick up a pretrained model and extend it or apply it for my own purposes.
by Arijit G•
Andrew is like an inspiration to me. The entire specialization is top notch, but somehow this course is quite off par. Even after going through the videos several times, things are still a mess.
I love how Andrew makes things easier, but this course needed more delicate tuning, with a bit of programming included in the videos: how to go about transfer learning, how to move on to building our own ConvNets.
The course started superbly with the description of filters, how horizontal and vertical filters are made and so on, but with the algorithms it went poof. A more delicate approach to teaching the complex models would have helped.
And this course really needed a fully-fledged assignment, not the fill-in-the-blanks type. Hopefully, the instructor takes note of this.
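For what it's worth, the "horizontal and vertical filters" idea from those early videos is easy to reproduce from scratch. Here is a minimal NumPy sketch; the filter values and the toy image are illustrative, not taken from the course notebooks:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Naive 'valid' 2-D convolution (strictly, cross-correlation,
    as in most deep learning libraries)."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * kernel)
    return out

# A vertical-edge detector: responds to bright-to-dark transitions
# going left to right.
vertical_filter = np.array([[1, 0, -1],
                            [1, 0, -1],
                            [1, 0, -1]])

# Toy 6x6 image: left half bright (10), right half dark (0).
image = np.hstack([np.full((6, 3), 10.0), np.zeros((6, 3))])

edges = conv2d_valid(image, vertical_filter)
# Every output row is [0, 30, 30, 0]: strong response exactly
# where the vertical edge sits, zero in the flat regions.
```

Swapping in the transposed kernel gives the horizontal-edge detector the lectures describe.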
by Christian C•
Generally, I admired the "fundamental principles" approach with which the course was taught. It's helpful for those who want to understand CNNs from scratch.
On the other hand, there are some points for improvement. First, I think the programming exercises are insufficient: they are good as an entry-level experience of how the lectures are implemented, but there should be some additional exercises (probably optional) that focus more on practical settings. Second, I think it's time for the course to consider adapting to TensorFlow 2.x. Third, although this is rather personal, I found the discussion of object detection too short.
Nevertheless, I would recommend this course to anyone who just wants to gain a conceptual understanding of CNNs.
by Nathan W•
Of the classes offered by this source, this has really been the weakest. The editing errors (and the tone in the background) were mechanically really grating, but the bigger issue is that the classes try to introduce TensorFlow and Keras. One of the strengths of the earlier units was that they kept to MATLAB or NumPy, two very solid low-level tools that both require digging into the mechanics AND are very stable themselves. The TF/Keras material, on the other hand, introduces confusion, skips over mechanics rather than teaching them, and is already out of date. Even worse, because it is out of date, things like links to documentation often fail. So the class feels dated even though it could have stuck with tools that do not really age.
by Ian P•
The YOLO week was fuzzy on some fundamental concepts around what a ConvNet output looks like when split into a grid and how bounding boxes are resolved when a shape extends beyond its own cell. You can see a lot of confused students asking similar questions in the forum, and most of the TAs seemed pretty unsure of their own understanding of YOLO as well, hedging most of their responses with "This is the way I understand it, but I may be wrong". The YOLO homework and the Neural Style Transfer homework gave a poor introduction to some very unintuitive TensorFlow concepts. It's got me curious about how the Fast.AI course made the switch from TensorFlow to PyTorch; I'd love to make that switch after these assignments.
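On the grid question specifically, the usual YOLO convention (as I understand it from the paper) is that a box belongs to the single cell containing its center point, even when the box extends well past that cell; width and height are predicted relative to the whole image. A toy NumPy sketch of that assignment, with a made-up grid size and box:

```python
import numpy as np

def assign_to_cell(box, grid_size=19):
    """Assign a box (cx, cy, w, h), all normalized to [0, 1],
    to the grid cell containing its center. The box may be far
    larger than one cell; only the center decides ownership."""
    cx, cy, w, h = box
    col = min(int(cx * grid_size), grid_size - 1)
    row = min(int(cy * grid_size), grid_size - 1)
    # Offsets of the center within its cell, in [0, 1) -- this is
    # what a YOLO-style head actually regresses for x and y.
    x_off = cx * grid_size - col
    y_off = cy * grid_size - row
    return row, col, x_off, y_off

# A box centered in the image but half the image wide: with a
# 19x19 grid, cell (9, 9) still owns it despite its size.
row, col, x_off, y_off = assign_to_cell((0.5, 0.5, 0.5, 0.3))
```

So a box spilling over its cell boundaries is not split or clipped; the spilled extent is simply encoded in the (possibly > 1-cell) width and height.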
by Jake B•
I liked the content of the lectures, but this course seems unfinished. Several of the videos were poorly edited and contained portions that were clearly meant to have been edited out. More disappointing, the assignments did not build on each other or on the lectures very well, and some of the assignments required more understanding of TF than was provided through the earlier assignments. Also, the assignments did not seem to follow clear patterns, which made them somewhat difficult at times.
IMHO, the neural style transfer material could be removed and replaced with more exercises in TF or Keras. I think that would be more valuable and help people be better prepared to use either on their own.
by Trevor M•
As with the rest of this series: great lectures, terrible coursework.
Not that it isn't understandable: to offer this course to hundreds of thousands of people, they have to automate the process, which means you can't have people checking your work to grade you, so you cannot have complicated projects (which is somewhat ironic, given this is a series on deep learning and artificial intelligence). It would be fantastic if it were structured similarly to the Machine Learning course by Andrew Ng. Anyhow, if you want to gain a better understanding of these topics, you have to go out, build your own networks from scratch, and read the papers.
by Klas K•
The subject of this course is very interesting, and I love that it is so bleeding edge. But the quality needs to be improved. Many videos repeat the same sentence again and again and seem to be very poorly edited. But even more annoying (and time consuming!) are the errors and inconsistencies in the exercises, most notably in the grader (triplet_loss for week 4). Also, the changes to the quiz result displays are not helpful: I was told that my answer was wrong for at least two questions where I was pretty sure, and I would have liked to see that the answer I intended to give really was wrong. Maybe I just clicked the wrong checkbox unintentionally.
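For reference, the triplet loss that grader checks is straightforward to state outside the notebook. A NumPy sketch (the margin value and the toy embeddings are illustrative; the assignment's TF version would use tf.reduce_sum and tf.maximum):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """FaceNet-style triplet loss over a batch of embeddings:
    pull the anchor toward the positive, push it away from the
    negative, with margin alpha, clipping easy triplets to zero."""
    pos_dist = np.sum((anchor - positive) ** 2, axis=-1)
    neg_dist = np.sum((anchor - negative) ** 2, axis=-1)
    basic = pos_dist - neg_dist + alpha
    return np.sum(np.maximum(basic, 0.0))

# An easy triplet (negative already far away) contributes no loss:
a = np.array([[0.0, 0.0]])
p = np.array([[0.1, 0.0]])
n = np.array([[5.0, 0.0]])
loss = triplet_loss(a, p, n)  # 0.01 - 25 + 0.2 < 0 -> clipped to 0
```

Having such a reference makes it much easier to tell a grader bug apart from a bug in your own cell.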
by Raj S•
The course material is good but lacking in the area of how to use TensorFlow. Unfortunately, the TensorFlow documentation itself is terrible. The testing and grading systems are buggy and haven't been fixed for months (check the forums). Specifically, in the first programming assignment, when one of the assignment functions returns the correct answer based on the specifications provided in the code, the grader grades it 0, and grades it correct when you violate the specifications and generate a wrong answer. In the quiz, portions of the questions are blank/missing, and one has to totally guess the answer (obviously I was unlucky enough to guess both of my questions wrong :( )
by Guenther M•
I had problems with assignments in week 4. The strange thing: sometimes everything is explained in maybe even too much detail, then again there are cases where one feels fooled, like when you have to use np.sum() instead of tf.reduce_sum() in the verify() cell. By suggesting the use of tf.reduce_sum in the cell before, you indirectly suggest its usage later on as well! This really doesn't add anything to your qualifications; it is just annoying having to skim a lot of threads in the forum to finally find the solution.
And more care should have been given to the videos: Andrew's repetitions of whole sentences should have been cut out!
by Nathan Y•
While, as always, Professor Ng was brilliant and informative, the final homework assignment (face recognition) was a disaster. Not only could we not load the weights because of corrupt files, but when that was resolved and the homework was submitted, the grader would only pass students who intentionally answered the triplet section of the code wrong. What made this especially painful was the time it took to run the models; TensorFlow is not the easiest code to debug. One of the mentors for the course needs to monitor the forums closely (twice a day would not be too often) and react and take charge when things start going badly.
by Abhishek R•
The course material is really good, and Andrew explains things really well. However, the programming assignments cause a lot of problems owing to the behavior of the grader, whereby correct answers are marked as incorrect/incomplete, and the only way to submit the assignment or get it graded correctly is to follow steps from the forums to change the files and trick the grader into accepting it. From the forums it seems these problems have existed for over two years and still have not been fixed. Overall, the programming assignments are really good and help in understanding the implementations.
by Adi G•
I took this course because I hope to apply machine learning to biological problems. While the first two weeks were great and super general, the third and particularly the fourth weeks were less relevant to me at this point, but I had to struggle through them to get the certificate. Ideally, one way to improve this would be to create another week, dedicated either to biological problems or to something more generally applicable, and let students choose between the content of that week and the current 4th week. Another option is to make this a 3-week course and leave the 4th week entirely optional.
by Vincent S•
The video lessons give very clear and understandable concepts, but I don't feel that the coding exercises will help me write my own code. I could easily fill in the blanks and get the required grades, but I have to admit that for most of it I didn't understand what I was doing, or what was happening in the parts I didn't have to fill in. I have a reasonably strong mathematical background and barely any coding knowledge (a bit of MATLAB and beginner Python training). The whole deep learning program was going relatively well up to the coding exercises in this course, which jump a step too far for me.
Not as great as the previous three courses. The exercises here are much more challenging than before, but not always for the right reasons. A thorough primer on TensorFlow should be mandatory in this course. A lot of the time you eventually manage to complete the exercises without really knowing what you are doing. The subject matter in this course is also more complex than in the previous courses, so more attention needs to be put on really making students understand the fundamentals thoroughly. Also, the grader output is sometimes buggy or inexplicable. Andrew Ng is still a great instructor, though.