Data Pipelines with TensorFlow Data Services

Data Pipelines with TensorFlow Data Services is a free online course provided by Coursera, spanning 4 weeks and roughly 16 hours of material. The course is taught in English by Laurence Moroney, and an e-certificate from Coursera is available upon completion.

Overview
  • Bringing a machine learning model into the real world involves a lot more than just modeling. This Specialization will teach you how to navigate various deployment scenarios and use data more effectively to train your model.

    In this third course, you will:
    - Perform streamlined ETL tasks using TensorFlow Data Services (see the sketch after this list)
    - Load different datasets and custom feature vectors using TensorFlow Hub and TensorFlow Data Services APIs
    - Create and use pre-built pipelines for generating highly reproducible I/O pipelines for any dataset
    - Optimize data pipelines that become a bottleneck in the training process
    - Publish your own datasets to the TensorFlow Hub library and share standardized data with researchers and developers around the world
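
    A minimal ETL sketch of the kind of pipeline covered here, assuming TensorFlow and the tensorflow_datasets package are installed; the dataset name ('mnist'), batch size, and normalization function are illustrative choices rather than the course's exact exercise:

        import tensorflow as tf
        import tensorflow_datasets as tfds

        # Extract: download (if needed) and load the dataset as a tf.data.Dataset
        dataset = tfds.load('mnist', split='train', as_supervised=True)

        # Transform: scale pixel values to [0, 1], then shuffle and batch
        def normalize(image, label):
            return tf.cast(image, tf.float32) / 255.0, label

        dataset = dataset.map(normalize).shuffle(1024).batch(32)

        # Load: consume batches, e.g. by passing the dataset to model.fit
        for images, labels in dataset.take(1):
            print(images.shape, labels.shape)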


    This Specialization builds upon our TensorFlow in Practice Specialization. If you are new to TensorFlow, we recommend that you take the TensorFlow in Practice Specialization first. To develop a deeper, foundational understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.

Syllabus
    • Data Pipelines with TensorFlow Data Services
      • This week, you will be able to perform efficient ETL tasks using TensorFlow Data Services APIs
    • Splits and Slices API for Datasets in TF
      • This week, you will construct train/validation/test splits of any dataset - either custom or from the TensorFlow Hub dataset library - using the Splits API (see the sketch after this syllabus)
    • Exporting Your Data into the Training Pipeline
      • This week, you will extend your knowledge of data pipelines
    • Performance
      • You'll learn how to handle your data input to avoid bottlenecks, race conditions, and more (see the sketches after this syllabus)
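
    The Splits API mentioned in week 2 can be sketched as follows, assuming tensorflow_datasets is installed; the 80/10/10 percentages and the 'mnist' dataset are illustrative:

        import tensorflow_datasets as tfds

        # Slice a single 'train' split into train/validation/test subsets
        train_ds, val_ds, test_ds = tfds.load(
            'mnist',
            split=['train[:80%]', 'train[80%:90%]', 'train[90%:]'],
            as_supervised=True,
        )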
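
    For the performance week, a typical way to remove input bottlenecks with tf.data is to parallelize preprocessing, cache, and prefetch. This is a sketch assuming TensorFlow 2.x (where tf.data.AUTOTUNE is available); the dataset and preprocessing are again illustrative:

        import tensorflow as tf
        import tensorflow_datasets as tfds

        dataset = tfds.load('mnist', split='train', as_supervised=True)

        def normalize(image, label):
            return tf.cast(image, tf.float32) / 255.0, label

        # Parallelize the map, cache the transformed data, and prefetch so
        # input preparation overlaps with training instead of stalling it
        dataset = (dataset
                   .map(normalize, num_parallel_calls=tf.data.AUTOTUNE)
                   .cache()
                   .shuffle(1024)
                   .batch(32)
                   .prefetch(tf.data.AUTOTUNE))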