Deep Learning and Neural Networks for Financial Engineering

Deep Learning and Neural Networks for Financial Engineering is a comprehensive online course provided by edX. It runs for 7 weeks, with an expected workload of 4-6 hours per week, and is taught by Ken Perry. Upon completion of the course, you can receive an e-certificate from edX. The course is taught in English and is a paid course; visit the course page at edX for detailed price information.

Overview
  • Deep Learning ventures into territory associated with Artificial Intelligence. This course demonstrates how neural networks can improve practice in various disciplines, with examples drawn primarily from financial engineering. Students will gain an understanding of deep learning techniques, including how alternative data sources such as images and text can advance practice within finance.

Syllabus
  • Week 0: Classical Machine Learning: Overview

    • Guided entry for students who have not taken the first course in the series

    • Notational conventions

    • Basic ideas: linear regression, classification

    • Recipe for Machine Learning

  • Week 1: Introduction to Neural Networks and Deep Learning

    • Neural Networks Overview

    • Coding Neural Networks: TensorFlow, Keras (a minimal sketch follows this week's topics)

    • Practical Colab
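
    As a flavor of the "Coding Neural Networks: TensorFlow, Keras" topic, the sketch below builds and trains a small feed-forward network in Keras. It is illustrative only: the layer sizes and the synthetic regression data are assumptions, not course material.

```python
# Minimal Keras feed-forward network (illustrative sketch; the layer sizes
# and synthetic regression data are assumptions, not course material).
import numpy as np
from tensorflow import keras

# Synthetic regression data: 1,000 samples with 10 features each.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10)).astype("float32")
y = (X @ rng.normal(size=(10, 1))).astype("float32")

# A small fully connected network with a linear output for regression.
model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])

model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(X, y, verbose=0))  # final training MSE
```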

  • Week 2: Convolutional Neural Networks

    • A neural network is a Universal Function Approximator

    • Convolutional Neural Networks (CNN): Introduction (see the sketch after this week's topics)

    • CNN: Multiple input/output features

    • CNN: Space and time
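
    As an illustration of this week's CNN topics, the sketch below applies 1-D convolutions over a multivariate time series, the "space and time" setting most natural in finance. The window length, filter counts, and synthetic data are assumptions, not course material.

```python
# 1-D convolutional network over a multivariate time series (illustrative
# sketch; shapes and synthetic data are assumptions, not course material).
import numpy as np
from tensorflow import keras

# 500 samples, each a window of 60 time steps with 4 input features
# (e.g., several price/volume series); target is a single value.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 60, 4)).astype("float32")
y = rng.normal(size=(500, 1)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(60, 4)),
    # 16 output feature maps, each looking at 5 consecutive time steps.
    keras.layers.Conv1D(filters=16, kernel_size=5, activation="relu"),
    keras.layers.MaxPooling1D(pool_size=2),
    keras.layers.Conv1D(filters=32, kernel_size=3, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(1),
])

model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```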

  • Week 3: Recurrent Neural Networks

    • Recurrent Neural Networks (RNN): Introduction

    • RNN Overview

    • Generating text with an RNN (a toy sketch follows this week's topics)
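
    As an illustration of generating text with an RNN, the toy sketch below trains a character-level recurrent model on a repeated sentence and samples a short continuation. The training text, model sizes, and sampling loop are illustrative assumptions, not course material.

```python
# Toy character-level text generation with an RNN (illustrative sketch;
# the tiny training string and model sizes are assumptions).
import numpy as np
from tensorflow import keras

text = "the quick brown fox jumps over the lazy dog " * 50
chars = sorted(set(text))
char_to_id = {c: i for i, c in enumerate(chars)}
ids = np.array([char_to_id[c] for c in text])

seq_len = 20
X = np.stack([ids[i:i + seq_len] for i in range(len(ids) - seq_len)])
y = ids[seq_len:]  # the character that follows each window

model = keras.Sequential([
    keras.Input(shape=(seq_len,), dtype="int32"),
    keras.layers.Embedding(input_dim=len(chars), output_dim=16),
    keras.layers.SimpleRNN(64),  # a plain RNN; Week 6 revisits LSTM
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)

# Generate characters by repeatedly sampling the predicted distribution.
seed = ids[:seq_len].tolist()
generated = []
for _ in range(40):
    probs = model.predict(np.array([seed[-seq_len:]]), verbose=0)[0]
    probs = probs.astype("float64")
    probs /= probs.sum()  # renormalize to guard against float32 rounding
    next_id = int(np.random.choice(len(chars), p=probs))
    generated.append(chars[next_id])
    seed.append(next_id)
print("".join(generated))
```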

  • Week 4: Training Neural Networks

    • Backpropagation

    • Vanishing and exploding gradients

    • Initializing and maintaining weights

    • Improving trainability (illustrated in the sketch after this week's topics)

    • How big should my Neural Network be?
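
    As an illustration of this week's trainability topics, the sketch below combines three standard safeguards: explicit He initialization, batch normalization, and gradient clipping in the optimizer. The architecture and hyperparameters are illustrative assumptions, not course material.

```python
# Common training safeguards: He initialization, batch normalization, and
# gradient clipping (illustrative sketch; hyperparameters are assumptions).
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (rng.normal(size=(1000, 1)) > 0).astype("float32")  # binary labels

model = keras.Sequential([
    keras.Input(shape=(20,)),
    # He initialization pairs well with ReLU and helps keep gradients healthy.
    keras.layers.Dense(64, activation="relu", kernel_initializer="he_normal"),
    keras.layers.BatchNormalization(),
    keras.layers.Dense(64, activation="relu", kernel_initializer="he_normal"),
    keras.layers.Dense(1, activation="sigmoid"),
])

# clipnorm rescales any gradient whose norm exceeds 1.0, guarding against
# exploding gradients.
opt = keras.optimizers.Adam(learning_rate=1e-3, clipnorm=1.0)
model.compile(optimizer=opt, loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```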

  • Week 5: Interpretation and Transfer Learning

    • Interpretation: Preview

    • Transfer Learning

    • Tensors, Matrix Gradients

  • Week 6: Advanced Recurrent Architectures

    • Gradients of an RNN

    • RNN Gradients that vanish and explode

    • Residual connections

    • Neural Programming

    • LSTM (see the sketch after this week's topics)

    • Attention: introduction
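
    As an illustration of the LSTM topic above, the sketch below swaps a plain recurrent layer for an LSTM, whose gating mechanism mitigates vanishing gradients on long sequences. The sequence length, layer size, and synthetic data are illustrative assumptions, not course material.

```python
# Replacing a plain recurrent layer with an LSTM (illustrative sketch;
# shapes and synthetic data are assumptions, not course material).
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100, 4)).astype("float32")  # 100 steps, 4 features
y = rng.normal(size=(500, 1)).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(100, 4)),
    keras.layers.LSTM(32),  # gated recurrent layer instead of SimpleRNN
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```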

  • Week 7: Advanced Topics

    • Natural Language Processing (NLP)

    • Interpretation: what is going on inside a Neural Network

    • Attention

    • Adversarial examples

    • Final words