Sequence Probabilities


Skills You'll Learn

Word2vec, Parts-of-Speech Tagging, N-gram Language Models, Autocorrect

Reviews

4.7 (1,238 ratings)

  • 5 stars: 81.09%
  • 4 stars: 13.73%
  • 3 stars: 3.71%
  • 2 stars: 0.48%
  • 1 star: 0.96%

AH

September 28, 2020

5 / 5 stars

Very good course! It helped me clearly learn about autocorrect, edit distance, Markov chains, n-grams, perplexity, backoff, interpolation, word embeddings, and CBOW. This was very helpful!

SR

August 4, 2021

5 / 5 stars

Another great course introducing probabilistic modelling concepts and gradually moving in the direction of neural networks. One must learn in detail how embeddings work.

From the Lesson

Autocomplete and Language Models

Learn how N-gram language models work by calculating sequence probabilities, then build your own autocomplete language model using a text corpus from Twitter!
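
The lesson description above refers to scoring a word sequence with an N-gram language model. As a rough illustration only (not the course's assignment code), the sketch below computes a sentence probability with a bigram model and add-k smoothing over a toy corpus; the corpus, function names, and smoothing constant k are assumptions made for this example.

```python
from collections import Counter

def bigram_counts(sentences):
    """Count unigrams and bigrams, padding each sentence with <s> and </s>."""
    unigrams, bigrams = Counter(), Counter()
    for tokens in sentences:
        padded = ["<s>"] + tokens + ["</s>"]
        unigrams.update(padded)
        bigrams.update(zip(padded, padded[1:]))
    return unigrams, bigrams

def sequence_probability(tokens, unigrams, bigrams, k=1.0):
    """P(w_1 ... w_n) as a product of add-k smoothed bigram probabilities."""
    vocab_size = len(unigrams)
    padded = ["<s>"] + tokens + ["</s>"]
    prob = 1.0
    for prev, word in zip(padded, padded[1:]):
        prob *= (bigrams[(prev, word)] + k) / (unigrams[prev] + k * vocab_size)
    return prob

# Toy corpus and query sentence, purely illustrative.
corpus = [["i", "like", "green", "tea"], ["i", "like", "nlp"]]
uni, bi = bigram_counts(corpus)
print(sequence_probability(["i", "like", "nlp"], uni, bi))
```

An autocomplete system built this way would, given a prefix, rank candidate next words by their smoothed N-gram probability and suggest the highest-scoring ones.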

Instructors

  • Younes Bensouda Mourri
    Instructor

  • Łukasz Kaiser
    Instructor

  • Eddy Shyu
    Senior Curriculum Developer
