TensorFlow Serving with Docker for Model Deployment

4.9
42 ratings
Offered by
Coursera Project Network
4,175 already enrolled
In this Guided Project, you will:

Train and export TensorFlow Models for text classification

Serve and deploy models with TensorFlow Serving and Docker

Perform model inference with gRPC and REST endpoints

1.5 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

This is a hands-on, guided project on deploying deep learning models using TensorFlow Serving with Docker. In this 1.5-hour project, you will train and export TensorFlow models for text classification, learn how to deploy models with TF Serving and Docker in 90 seconds, and build simple gRPC and REST-based clients in Python for model inference.

With the worldwide adoption of machine learning and AI by organizations, it is becoming increasingly important for data scientists and machine learning engineers to know how to deploy models to production. While DevOps teams are fantastic at scaling applications, they are typically not experts in ML ecosystems such as TensorFlow and PyTorch. This guided project gives you a solid, real-world foundation for pushing your TensorFlow models from development to production in no time!

Prerequisites: In order to successfully complete this project, you should be familiar with Python and have prior experience building models with Keras or TensorFlow.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
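
To make the workflow concrete, here is a minimal, illustrative sketch (not the course's exact code) of the end-to-end flow: export a Keras text classifier built on a TensorFlow Hub embedding as a SavedModel, serve it with the tensorflow/serving Docker image, and query the REST endpoint from Python. The model name amazon_review_model, the TF Hub handle, and the layer sizes are placeholders, not the project's actual artifacts.

# Illustrative sketch only; names, paths, and the TF Hub embedding are placeholders.
import json

import requests
import tensorflow as tf
import tensorflow_hub as hub

# 1) Build and export a simple text classifier as a SavedModel (protobuf, TF 2.x).
model = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/nnlm-en-dim50/2",  # example embedding
                   input_shape=[], dtype=tf.string, trainable=True),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # e.g. positive vs. negative review
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# ... model.fit(train_dataset, epochs=...) on the preprocessed review text ...
model.save("amazon_review_model/1")  # "1" is the version subdirectory TF Serving expects

# 2) Serve it with Docker (run in a shell; shown here as a comment):
#    docker run -p 8501:8501 \
#      -v "$(pwd)/amazon_review_model:/models/amazon_review_model" \
#      -e MODEL_NAME=amazon_review_model tensorflow/serving

# 3) Query the REST endpoint (default port 8501).
payload = json.dumps({"signature_name": "serving_default",
                      "instances": ["This product was absolutely delicious!"]})
response = requests.post(
    "http://localhost:8501/v1/models/amazon_review_model:predict",
    data=payload, headers={"content-type": "application/json"})
print(response.json()["predictions"])  # one prediction per instance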

Skills you will develop

  • Deep Learning
  • Docker
  • TensorFlow Serving
  • TensorFlow
  • Model Deployment

Learn step-by-step

In a video that plays in a split screen with your work area, your instructor will walk you through these steps:

  1. Introduction and Demo Deployment

  2. Load and Preprocess the Amazon Fine Foods Review Data

  3. Build Text Classification Model using Keras and TensorFlow Hub

  4. Define Training Procedure

  5. Train and Export Model as Protobuf

  6. Test Model

  7. TensorFlow Serving with Docker

  8. Set up a REST Client to Perform Model Predictions

  9. Set up a gRPC Client to Perform Model Predictions (see the sketch after this list)

  10. Versioning with TensorFlow Serving
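
Complementing step 9, here is a minimal sketch of a gRPC prediction client, assuming TensorFlow Serving is listening on its default gRPC port 8500 and the tensorflow-serving-api package is installed; the model name amazon_review_model and the input key input_1 are placeholders that should be checked against the exported signature (for example with saved_model_cli).

# Hedged sketch of a gRPC client (step 9); model and tensor names are placeholders.
import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

# TensorFlow Serving exposes gRPC on port 8500 by default (REST is on 8501).
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "amazon_review_model"        # placeholder model name
request.model_spec.signature_name = "serving_default"
# The input key depends on the exported signature; inspect it with saved_model_cli.
request.inputs["input_1"].CopyFrom(
    tf.make_tensor_proto(["This product was absolutely delicious!"]))

result = stub.Predict(request, timeout=10.0)
print(result.outputs)  # map of output tensor names to TensorProto values

For step 10, TensorFlow Serving treats numbered subdirectories of the model base path (for example amazon_review_model/1/ and amazon_review_model/2/) as model versions and serves the highest version by default, so rolling out a new model is as simple as exporting it into a new numbered directory.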

How Guided Projects work

Your workspace is a cloud desktop right in your browser, so no download is required

In a split-screen video, your instructor guides you step by step


Frequently asked questions

If you have additional questions, please visit the Learner Help Center.