
Learner reviews and feedback for Sequence Models by deeplearning.ai

4.8
28,102 ratings

About the Course

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly-used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and Question Answering. The Deep Learning Specialization is a foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to take the definitive step in the world of AI by helping you gain the knowledge and skills to level up your career.

Top Reviews

JY

October 29, 2018

The lectures cover lots of SOTA deep learning algorithms, and they are well designed and easy to understand. The programming assignments are really good for reinforcing the understanding of the lectures.

AM

June 30, 2019

The course is very good and has taught me all the important concepts required to build a sequence model. The assignments are also very neatly and precisely designed for real-world applications.

Filter by:

Sequence Models: 3101 - 3125 of 3,369 Reviews

by Enzo D

April 13, 2018

Very useful!!!

by Mohamad T

January 27, 2022

too long, but good

by guggilla s

March 9, 2019

easy to learn

by Sayon S

June 11, 2018

A bit cryptic

by Phoenix A

March 28, 2018

Good material

by yanhang

July 17, 2019

very useful

by Paul A

February 24, 2019

a bit hard!

by Sonia D

January 31, 2019

Very Useful

by BILLA N R

April 18, 2020

productive

by Roberto J

February 14, 2018

Thank you.

by Ariel H

October 13, 2018

Excellent

by 36 - O S

June 1, 2020

Average

by Dave

July 11, 2020

good

by teaching m

April 5, 2020

nice

by VIGNESHKUMAR R

October 24, 2019

good

by Shashank V M

September 16, 2019

Good

by Yashwanth M

July 23, 2019

Good

by Rahila T

November 15, 2018

Good

by savinay s

April 9, 2018

good

by krishna m s g

March 22, 2018

good

by Aaradhya S

April 25, 2020

..

by Natalia O

October 4, 2019

In comparison to the previous courses in this sequence, this one is even less structured - probably because even broader knowledge is squeezed into only 3 weeks, but I feel like a lot is skipped between the videos (which are OK) and the tasks. In many assignment tasks in this course it is not very well explained what is meant to be done - I mean this especially in the case of Keras objects. In many cases it is quite unclear how those classes are supposed to be handled in the context of our task. There are some hints, but those are mostly links to documentation (by the way, some of the links are no longer up to date), and it is often not well explained which properties those objects have, what one can do with them, etc., so one ends up trying those objects in different configurations, googling around, and looking on the course forum for the right answer, which is very difficult to derive. There should be more precise instructions regarding handling Keras objects - the examples in the documentation and in blogs are often much simpler than those in the assignments, so one ends up not knowing what is going on. In summary, there is a big jump and a big gap between the intuitions in the videos (which, by the way, are much fuzzier than those in the first courses of the specialization; the intuitions get more and more superficial as one doesn't go into detail) and what is being done in the assignments. One thing I really liked about the previous assignments was that when writing the code one could really know very well what is going on. That is no longer the case in this course...

by Mark S

October 9, 2019

As we head to the last course in the specialization (and the last two courses are the ones that interested me), we have error after error in the assignments, including problems with the kernel that are not obvious until you've struggled with incoherent stack trace output for a while.

Searching the disorganised discussion centre for the course/week in question, you can find that these errors affect everyone and go back a couple of years, never having been fixed. The mentors there help explain, but they cannot edit to fix the code, as they do not have permission, and the course supervisors have long since disappeared. So you have to submit incorrect code to pass, then fix the code for your personal private code store - as the fixed code generates the correct numerical answers, which unfortunately do not match the numerical answers the grader requires to pass you!

It feels like, in the hurry to get the full specialization out, the final courses went downhill in terms of care & attention. Afterwards, all of the errors and badly designed code in the assignments caused many unexpected headaches, nothing to do with DL, and were never fixed or maintained by the course supervisors.

In the end, the delays in the final (two) course(s) added at least one extra monthly payment onto my subscription. Overall I can't complain, as the specialization is good, but it feels abandoned by the lecturer & assistant lecturers since early 2018.

by Stephen D

May 13, 2018

It's helpful to have this course, since there aren't enough beginner-oriented courses on these topics, especially ones that also get into actual equations like he does. However, I think he doesn't provide enough explanation of complicated topics like GRUs and LSTMs. There are lots of confusing aspects of both technologies, and he could afford to spend even more time on explanation than he does.

EDIT: I am now on week 2. This course feels rushed and he doesn't take the time to clarify confusing issues - for example, when he first introduces how to learn word embeddings he calls the neural network you use a "language model" even though the network bears no resemblance to the language model we learned in week 1. This really confused me and he doesn't address this point. Also, he variously describes the embeddings as the "input" and the "parameters" of this neural network, even though those are clearly two different things. There are more issues where that came from.

Unlike all of his previous courses, I've found myself needing to go to Wikipedia and Google to try to fill in various holes in the presentations here.

Also, there is essentially no help on the forums. That isn't the reason for my low rating, since for a cheap course I didn't expect much. Still, it would have been nice if they had tried to do a little bit more there.

by Ramon R

May 8, 2018

Unlike the other courses Andrew Ng provides, this one contains many spelling mistakes in the programming assignments, the programming assignments are less structured and understandable (missing or wrong information in nearly every assignment), and an introduction to Keras is missing. I found it great that the Keras framework is an important component of this course, but unlike the TensorFlow introduction, one is missing here. It is frustrating when you may have the right functions but no information on how to determine and pass the correct arguments to them. Anyway, I found the outline of the course very good, as it gives a good overview of many methods and how they work. To my mind, the consistency of the assignments and also the storytelling need to be improved to reach the level of other courses where Andrew is involved. It seemed more chaotic, and the complexity of the algorithms is overwhelming, so a better introduction to how they work would be appealing. In the end, I worked through it and gained a basic understanding of Keras and RNN algorithms. So it was definitely worth it.