Machine Learning and AI Foundations: Value Estimations

Machine Learning and AI Foundations: Value Estimations is a free online course provided by LinkedIn Learning, comprising 1-2 hours of material. The course is taught in English by Adam Geitgey, and an e-certificate from LinkedIn Learning is available upon completion.

Overview
  • Discover how to solve value estimation problems with machine learning. Learn how to build a value estimation system that can estimate the value of a home.

Syllabus
  • Introduction
    • Welcome
    • What you should know
    • Using the exercise files
    • Set up the development environment
  1. What Is Machine Learning and Value Prediction?
    • What is machine learning?
    • Supervised machine learning for value prediction
    • Build a simple home value estimator
    • Find the best weights automatically
    • Cool uses of value prediction
  2. An Overview of Building a Machine Learning System
    • Introduction to NumPy, scikit-learn, and pandas
    • Think in vectors: How to work with large data sets efficiently
    • The basic workflow for training a supervised machine learning model
    • Gradient boosting: A versatile machine learning algorithm
  3. Training Data
    • Explore a home value data set
    • Standard conventions for naming training data
    • Decide how much data you need
  4. Features
    • Feature engineering
    • Choose the best features for home value prediction
    • Use as few features as possible: The curse of dimensionality
  5. Coding Our System
    • Prepare the features
    • Training vs. testing data
    • Train the value estimator
    • Measure accuracy with mean absolute error
  6. Improving Our System
    • Overfitting and underfitting
    • The brute force solution: Grid search
    • Feature selection
  7. Using the Estimator in a Real-World Program
    • Predict values for new data
    • Retrain the classifier with fresh data
  • Conclusion
    • Wrap-up
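
To give a sense of the kind of system the syllabus builds toward, here is a minimal sketch of the train-and-evaluate workflow outlined in sections 3 through 5, using the course's libraries (pandas and scikit-learn). The CSV file, column names, and hyperparameters are illustrative assumptions, not the course's actual exercise files.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Load a hypothetical data set of past home sales.
df = pd.read_csv("home_sales.csv")

# Features (X) and the value we want to estimate (y).
X = df[["sq_ft", "num_bedrooms", "num_bathrooms", "year_built"]]
y = df["sale_price"]

# Hold back part of the data to test how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Train a gradient-boosted regression model, the algorithm named in section 2.
model = GradientBoostingRegressor(
    n_estimators=500, learning_rate=0.1, max_depth=4
)
model.fit(X_train, y_train)

# Measure accuracy with mean absolute error on both data sets;
# a large gap between the two suggests overfitting (section 6).
print("Training MAE:", mean_absolute_error(y_train, model.predict(X_train)))
print("Test MAE:    ", mean_absolute_error(y_test, model.predict(X_test)))
```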
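
Section 6 names grid search as the brute-force way to tune a model. In scikit-learn that is typically done with GridSearchCV; the parameter grid below is only an illustrative assumption, reusing the X_train and y_train variables from the previous sketch.

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Candidate hyperparameter values to try; the grid itself is illustrative.
param_grid = {
    "n_estimators": [500, 1000],
    "learning_rate": [0.05, 0.1],
    "max_depth": [4, 6],
    "min_samples_leaf": [3, 9],
}

# Try every combination with cross-validation, scoring by
# (negated) mean absolute error, and keep the best settings.
grid = GridSearchCV(
    GradientBoostingRegressor(),
    param_grid,
    scoring="neg_mean_absolute_error",
    cv=3,
)
grid.fit(X_train, y_train)

print("Best parameters:", grid.best_params_)
print("Best CV MAE:", -grid.best_score_)
```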
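
Section 7 is about using the estimator from a real-world program. One common pattern, sketched here with a hypothetical file name and the same illustrative feature columns, is to persist the trained model with joblib and reload it wherever estimates are needed.

```python
import joblib
import pandas as pd

# Save the trained estimator to disk once training is done.
joblib.dump(model, "home_value_model.pkl")

# Later, in a separate program: load the model and estimate a new home's value.
model = joblib.load("home_value_model.pkl")
new_home = pd.DataFrame(
    [[2500, 4, 3, 1998]],
    columns=["sq_ft", "num_bedrooms", "num_bathrooms", "year_built"],
)
print("Estimated value:", model.predict(new_home)[0])

# As new sales close, the model can be retrained on the fresh data
# and saved again, replacing the old file.
```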