Overall it was a great course, if a little weak on theory; for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.
This is the third course of the NLP Specialization. It was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.
by Oleh S•
Well, a very weak and oversimplified course. All videos are indecently short (from 1 to 4 minutes in the majority) and do not give any intuition or understanding of the sequence models. GRUs and LSTMs are explained too briefly. It was a weird decision to choose the Trax framework; it offers no reasonable advantages over Keras in this course. Also, I do not think that implementing the data generator function in the programming assignments gives anyone better intuition about the core material. The last two assignments can be completed even without watching the lecture videos. To sum up, this course can be valuable as just a short intro to recurrent networks, but do not expect to deepen your knowledge.
Seriously, the weakest of the first three courses: quickly prepared and lacking in quality. Totally disappointed.
by Tomasz S•
You can't learn anything from 3-minute videos, especially if one minute is wasted repeating the previous video and previewing what is going to be said, and the last minute is wasted summarizing what was said... That works for proper Coursera lectures that last 15 minutes, with a few hours of material per week. And it's really low to have someone listed as a lecturer when all he does is appear for 10 seconds to say "today I will teach/explain/show you," while he never actually does any teaching, explaining, or showing; everything is left to the assistant...
by Roberto C•
Very bad class: a week of material can be done in around 2 hours, and the exercises are uninteresting; you just write down what the comments tell you (like "add 1 to the index", ...). Also, every exercise is similar to the last: write the data generator, initialize the model using Trax, train the model, test the model...
I think it is a lost opportunity. The majority of the course is just familiarising yourself with the Trax API and blindly applying neural network architectures through it. The videos are very poor, with not much information given, and they repeat themselves a lot.
Instead of doing this course, it is better to do the original Sequence Models course from deeplearning.ai.
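The assignment pattern this reviewer describes (write a data generator, build the model, train, test) comes up in every week of the course. As a rough illustration of the first step, here is a hypothetical, framework-free sketch of the kind of batching generator the assignments have you write; names and behavior are my own simplification, not the course's exact code:

```python
import random

def data_generator(lines, batch_size, pad_id=0, shuffle=True, seed=42):
    """Yield (batch, lengths) pairs, where each batch is a list of
    token-id lists padded with pad_id to the longest sequence in it."""
    rng = random.Random(seed)
    order = list(range(len(lines)))
    if shuffle:
        rng.shuffle(order)
    for start in range(0, len(order), batch_size):
        chunk = [lines[i] for i in order[start:start + batch_size]]
        max_len = max(len(seq) for seq in chunk)
        batch = [seq + [pad_id] * (max_len - len(seq)) for seq in chunk]
        yield batch, [len(seq) for seq in chunk]
```

In the actual assignments a generator like this feeds a training loop; the point the reviewer makes is that re-implementing this padding-and-batching boilerplate each week teaches little about the models themselves.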
by Jingying W•
I have previously completed the Deep Learning and AI for Medicine specializations provided by deeplearning.ai, and here are some of my thoughts about this course:
1. I like that the Python tutorials and assignments helped me learn the state-of-the-art DL framework Trax and become more familiar with the working mechanisms under the hood. If you read through the Python scripts carefully and look up the linked documentation, they are really nice study resources.
2. This course improved my understanding of some models that I learned in other specialization courses, such as the Siamese model (e.g. hard negatives) for natural language.
3. Although the slides are made fancier than before, I'm not sure I enjoy the way the material is explained. It's like reading from a script and not really talking TO the students. Maybe some direct (live) notes on the slides would help students actually "dive in" ;) .
4. I watched your live discussion on YouTube on 29 July and felt the lecturers talked very naturally there, but in the teaching videos they behave in quite an unnatural way.
5. What I love about Andrew's DL specialization is that he also talks about his insights in a sincere (and personal) way, but in this course, it's just tooooo official.
Anyway, just my personal thoughts. I learned a lot in this course and hope the next course will be better ;)
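The "hard negatives" idea mentioned in point 2 above can be shown in a tiny sketch. This is my own simplified scalar version (not the course's exact code) of a Siamese triplet-style loss for question duplication: it penalizes both the mean negative similarity and the closest (hardest) negative in the batch relative to the anchor-positive similarity:

```python
def triplet_loss_with_hard_negative(sim_pos, sim_negs, margin=0.25):
    """Simplified scalar Siamese triplet loss.
    sim_pos:  similarity of the anchor to its duplicate question.
    sim_negs: similarities of the anchor to non-duplicates in the batch."""
    mean_neg = sum(sim_negs) / len(sim_negs)  # average negative
    closest_neg = max(sim_negs)               # hardest negative
    loss_mean = max(0.0, mean_neg - sim_pos + margin)
    loss_hard = max(0.0, closest_neg - sim_pos + margin)
    return loss_mean + loss_hard
```

Training on the hardest in-batch negative, rather than a random one, is what gives the duplicate-question model most of its discriminative power.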
by Adam S A•
I think the assignments should have gone deeper. In my opinion, we should have built an LSTM instead of just creating a model in Trax. I can always learn an API, but it would be nice to learn the nitty-gritty of the model during the course. I also would prefer longer lecture videos that go into more detail.
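For readers who want the nitty-gritty this reviewer asks for, a single LSTM step fits in a few lines. This is a minimal single-unit sketch of the standard gate equations with scalar state, weights passed in as a plain dict; it is illustrative only, not the course's or Trax's implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a single-unit LSTM cell (scalar input and state).
    w maps gate names to input weights (W*), recurrent weights (U*),
    and biases (b*)."""
    f = sigmoid(w["Wf"] * x + w["Uf"] * h_prev + w["bf"])    # forget gate
    i = sigmoid(w["Wi"] * x + w["Ui"] * h_prev + w["bi"])    # input gate
    g = math.tanh(w["Wg"] * x + w["Ug"] * h_prev + w["bg"])  # candidate
    o = sigmoid(w["Wo"] * x + w["Uo"] * h_prev + w["bo"])    # output gate
    c = f * c_prev + i * g        # new cell state
    h = o * math.tanh(c)          # new hidden state
    return h, c
```

Writing this out once makes it clear why the cell state `c` carries gradients further than a simple RNN's hidden state: it is updated additively through the forget/input gates rather than squashed through a nonlinearity at every step.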
by Han-Chung L•
Programming assignments are not well constructed; I had to restart notebooks to get the exact expected output. Not all expected outputs are printed, and there are no test functions like in Course 1. Some of the concepts are glossed over too quickly in lecture.
by Prashant G•
This course is very mechanical; I expected a more reasoning-based course that incites logical thinking. Since most of the topics covered are an active area of research, a discussion from a "why or why not" point of view would have been more beneficial than just being told how to use a certain library, like any other blog on the internet.
by Brian B•
Programming notebooks contain a lot of errors and poor writing in the explanations (in text cells and in comments in the code cells). The use of Trax instead of TensorFlow or PyTorch also reduces the usefulness of this course for gaining experience with the frameworks I am most likely to use.
by Zhaohua F•
I think it would be better if we used TensorFlow 2.x or PyTorch instead of Trax, which is seldom used elsewhere. Also, the mathematical derivation of why the LSTM is better than a simple RNN would be better placed in the videos.
by Marcin Z•
Course materials and lectures are fine, but the exercises are boring: you have to implement data loaders every week. Moreover, the course uses Trax, as if there were no other popular deep learning frameworks... so you are forced to learn yet another syntax. PyTorch FTW!
by Kabakov B•
Do not waste your time on this course unless you just want a fancy certificate. It has the same problems as all previous courses in this specialization: the theory is very superficial, and the programming tasks are awful and divorced from real ones. Take the Sequence Models course from the DL specialization instead; they even put links to it in this course for optional deeper study.
by Moritz F•
Videos are annoyingly short and provide little depth. Assignments are basically just typing/copy-and-paste exercises. The whole NLP Specialization again starts with the absolute basics of Python and ML. I wouldn't mind this if there weren't already enough foundational courses available on Coursera; if I enroll in an NLP specialization, I don't expect/want to do a Python course. Almost everything shown here has already been covered in the Deep Learning Specialization.
by Kweweli T•
The lectures are well planned--very short and to the point. The labs offer immense opportunity for practice, and assignment notebooks are well-written! Overall, the course is fantastic!
by Sreejith S•
The assignments use the Trax library, and I found it a bit difficult to understand and implement. It would have been much better if they had used TensorFlow 2.x.
by Rabin A•
The course is fine, but if you've taken the course on Sequence Models by deeplearning.ai before, then this won't add much to your knowledge except the Siamese network. The main con of this course is the use of Trax instead of Keras or TensorFlow. I am not opposed to PyTorch, but since deeplearning.ai has TensorFlow courses, it would have been easier for many students to grasp the material instead of learning a new framework again.
by Ala T•
Personally, I'd prefer TensorFlow over Trax. I'm a bit lost between different tools, since different deeplearning.ai specializations use different ones. Other than that, I think it was quite a good short course.
by Mansi A•
The course is oversimplified and provides very little deeper knowledge of the techniques and networks used in NLP. It is a good course to get an overview, but if you want deeper knowledge, you'll have to invest the time yourself. Also, the use of the Trax library offered no advantage; Keras or TensorFlow should have been used instead.
by Arun C•
The use of RNNs in Trax is a bit abstract and should be explained further. For example, the time sequence is not clearly visible when training the model in Trax.
by Ahmad O•
In the assignments, the grader doesn't return feedback until the last question, which doesn't help me debug my code!
by Jorge A C•
As other reviewers noted, this course did not do much to build intuition for the methods used. The video lectures were short, and the explanations, though concise, were convoluted and not clear at all. For a real understanding of what sequence models are capable of, I recommend watching the Stanford CS224N lecture videos.
Learning the Trax library and solving practical problems with it was really interesting. The Siamese network architecture was a great thing to learn.
by Zoltan S•
This is an excellent course with some cutting-edge material, and also an introduction to a new learning framework, Trax.
by Jerry C•
Great course! However, the assignments hold your hand too much, step by step... I'd prefer assignments that let students think more for themselves when implementing functions, etc. (and only unhide hints or seek help on Slack after struggling for a long time).
by Swakkhar S•
The first two courses were much better. It introduces Trax, which is great. However, the material in this course is already covered in the fifth course of the Deep Learning Specialization. On the whole, a great course; great effort by the team.
by Li Z•
The LSTM explanation is not very clear; I had to visit some external links. The coding exercises are frustrating: even when run properly step by step, there were many glitches when submitting. I spent more time fixing submission issues than taking the lessons.