Course can be found on Coursera
Quizzes and answers are collected on my blog SSQ
Slides and more details about this course can be found on my GitHub SSQ
- Week 1:
- Understand the major trends driving the rise of deep learning.
- Be able to explain how deep learning is applied to supervised learning.
- Understand what the major categories of models are (such as CNNs and RNNs), and when they should be applied.
- Be able to recognize the basics of when deep learning will (or will not) work well.
- Week 2:
- Build a logistic regression model, structured as a shallow neural network
- Implement the main steps of an ML algorithm, including making predictions, derivative computation, and gradient descent.
- Implement computationally efficient, highly vectorized, versions of models.
- Understand how to compute derivatives for logistic regression, using a backpropagation mindset.
- Become familiar with Python and NumPy
- Work with IPython Notebooks
- Be able to implement vectorization across multiple training examples
- Python Basics with Numpy (optional assignment)
- Logistic Regression with a Neural Network mindset
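The Week 2 objectives above (predictions, derivative computation, gradient descent, and vectorization) can be sketched in a few lines of NumPy. This is a minimal illustration, not the graded assignment's code; the toy dataset and hyperparameters are my own choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression(X, Y, lr=0.5, iters=2000):
    """X: (n_features, m) with examples as columns; Y: (1, m) labels in {0, 1}."""
    n, m = X.shape
    w = np.zeros((n, 1))
    b = 0.0
    for _ in range(iters):
        A = sigmoid(w.T @ X + b)   # vectorized forward pass over all m examples
        dZ = A - Y                 # derivative of cross-entropy loss w.r.t. z
        dw = (X @ dZ.T) / m        # (n, 1), no explicit loop over examples
        db = dZ.sum() / m
        w -= lr * dw               # gradient descent update
        b -= lr * db
    return w, b

# Tiny linearly separable toy problem (an assumption for illustration)
X = np.array([[0., 1., 2., 3.]])
Y = np.array([[0., 0., 1., 1.]])
w, b = logistic_regression(X, Y)
preds = (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

Note the absence of any loop over training examples: the matrix products `w.T @ X` and `X @ dZ.T` are the vectorization the objectives refer to.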
- Week 3:
- Understand hidden units and hidden layers
- Be able to apply a variety of activation functions in a neural network.
- Build your first forward and backward propagation with a hidden layer
- Apply random initialization to your neural network
- Become fluent with Deep Learning notations and Neural Network Representations
- Build and train a neural network with one hidden layer.
- Build a 2-class classification complete neural network with a hidden layer
- Week 4:
- See deep neural networks as successive blocks stacked one after another
- Build and train a deep L-layer Neural Network
- Analyze matrix and vector dimensions to check neural network implementations.
- Understand how to use a cache to pass information from forward propagation to back propagation.
- Understand the role of hyperparameters in deep learning
- Building Deep Neural Network: Step by Step
- Deep Neural Network for Image Classification: Application
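The Week 4 objective of checking matrix and vector dimensions follows two rules from lecture: `W[l]` has shape `(n[l], n[l-1])` and `b[l]` has shape `(n[l], 1)`. A small sketch, with a hypothetical `layer_dims` chosen for illustration:

```python
import numpy as np

# Hypothetical architecture: 5 inputs, hidden layers of 4 and 3 units, 1 output
layer_dims = [5, 4, 3, 1]

def init_params(layer_dims, seed=1):
    """W[l]: (n[l], n[l-1]), b[l]: (n[l], 1) — the dimension rules from lecture."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = rng.standard_normal((layer_dims[l], layer_dims[l - 1])) * 0.01
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

params = init_params(layer_dims)
# Sanity-check every layer's shapes against the rules above
for l in range(1, len(layer_dims)):
    assert params["W" + str(l)].shape == (layer_dims[l], layer_dims[l - 1])
    assert params["b" + str(l)].shape == (layer_dims[l], 1)
```

Running this kind of shape audit before training catches most wiring bugs in an L-layer implementation.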
Course can be found on Coursera
Slides and more details about this course can be found on my GitHub SSQ
- Week 1 Practical aspects of Deep Learning
- Recall that different types of initializations lead to different results
- Recognize the importance of initialization in complex neural networks.
- Recognize the difference between train/dev/test sets
- Diagnose the bias and variance issues in your model
- Learn when and how to use regularization methods such as dropout or L2 regularization.
- Understand practical issues in deep learning, such as vanishing or exploding gradients, and learn how to deal with them
- Use gradient checking to verify the correctness of your backpropagation implementation
- Initialization
- Regularization
- Gradient Checking
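Gradient checking, as listed above, compares the analytic gradient from backpropagation with a two-sided numerical estimate. A minimal sketch on a hypothetical quadratic cost (not the course's full-network version):

```python
import numpy as np

def grad_check(f, grad_f, theta, eps=1e-7):
    """Compare an analytic gradient with the two-sided numerical estimate."""
    num_grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus = theta.copy();  plus[i] += eps
        minus = theta.copy(); minus[i] -= eps
        num_grad[i] = (f(plus) - f(minus)) / (2 * eps)   # central difference
    ana_grad = grad_f(theta)
    # Relative difference; values around 1e-7 or smaller suggest a correct backprop
    return np.linalg.norm(num_grad - ana_grad) / (
        np.linalg.norm(num_grad) + np.linalg.norm(ana_grad))

# Illustrative cost J(theta) = sum(theta^2), so dJ/dtheta = 2 * theta
theta = np.array([1.0, -2.0, 3.0])
diff = grad_check(lambda t: np.sum(t ** 2), lambda t: 2 * t, theta)
```

Because it perturbs one parameter per cost evaluation, gradient checking is slow and meant only for debugging, not for use during training.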
- Week 2 Optimization algorithms
- Remember different optimization methods such as (Stochastic) Gradient Descent, Momentum, RMSProp and Adam
- Use random minibatches to accelerate convergence and improve optimization
- Know the benefits of learning rate decay and apply it to your optimization
- Optimization
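Adam, listed among the Week 2 methods, combines Momentum (an exponentially weighted average of gradients) with RMSProp (an exponentially weighted average of squared gradients), plus bias correction. A sketch on a noisy quadratic, where the gradient noise stands in for minibatch sampling; the toy objective and hyperparameters are my own:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.05, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum + RMSProp with bias correction."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (RMSProp)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize J(w) = ||w||^2 with noisy gradients, as with random minibatches
rng = np.random.default_rng(0)
w = np.array([5.0, -3.0])
m = np.zeros_like(w); v = np.zeros_like(w)
for t in range(1, 2001):
    grad = 2 * w + 0.1 * rng.standard_normal(2)  # true gradient + minibatch noise
    w, m, v = adam_step(w, grad, m, v, t)
```

Dropping the `v` terms recovers plain Momentum; dropping the `m` terms recovers RMSProp, which is one way to remember how the three methods relate.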
- Week 3 Hyperparameter tuning, Batch Normalization and Programming Frameworks
- Master the process of hyperparameter tuning
- Master the process of Batch Normalization
- TensorFlow
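The forward pass of Batch Normalization is short enough to sketch directly: normalize each feature across the batch, then apply the learnable scale `gamma` and shift `beta`. The toy input below is an assumption for illustration.

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize each feature (row) over the batch (columns), then scale and shift."""
    mu = Z.mean(axis=1, keepdims=True)
    var = Z.var(axis=1, keepdims=True)
    Z_norm = (Z - mu) / np.sqrt(var + eps)   # zero mean, unit variance per feature
    return gamma * Z_norm + beta             # learnable scale and shift

Z = np.array([[ 1.0,  2.0,  3.0,  4.0],
              [10.0, 20.0, 30.0, 40.0]])
gamma = np.ones((2, 1)); beta = np.zeros((2, 1))
out = batchnorm_forward(Z, gamma, beta)
# With gamma=1, beta=0, each row now has mean ~0 and variance ~1
```

The point of `gamma` and `beta` is that the network can still represent any mean and variance it wants; normalization only changes the parameterization, which is what makes tuning easier.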
Course can be found on Coursera
Slides and more details about this course can be found on my GitHub SSQ
- Week 1 ML Strategy (1)
- Understand why Machine Learning strategy is important
- Apply satisficing and optimizing metrics to set up your goal for ML projects
- Choose a correct train/dev/test split of your dataset
- Understand how to define human-level performance
- Use human-level performance to define your key priorities in ML projects
- Take the correct ML Strategic decision based on observations of performances and dataset
- Week 2 ML Strategy (2)
- Understand what multi-task learning and transfer learning are
- Recognize bias, variance and data-mismatch by looking at the performances of your algorithm on train/dev/test sets
Course can be found on Coursera
Slides and more details about this course can be found on my GitHub SSQ
- Week 1 Foundations of Convolutional Neural Networks
- Understand the convolution operation
- Understand the pooling operation
- Remember the vocabulary used in convolutional neural networks (padding, stride, filter, ...)
- Build a convolutional neural network for image multi-class classification
- Convolutional Model: step by step
- Convolutional Model: application
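The convolution operation from Week 1 reduces to sliding a filter over the input and summing elementwise products, with the output size given by `(n + 2p - f) / s + 1`. A single-channel sketch (an illustrative loop version, not the graded assignment's multi-channel implementation):

```python
import numpy as np

def conv2d_single(image, kernel, stride=1, pad=0):
    """Valid 2-D cross-correlation (the 'convolution' used in CNNs), one channel."""
    if pad:
        image = np.pad(image, pad)
    h, w = image.shape
    f = kernel.shape[0]
    out_h = (h - f) // stride + 1   # the (n + 2p - f)/s + 1 formula, post-padding
    out_w = (w - f) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i*stride:i*stride+f, j*stride:j*stride+f]
            out[i, j] = np.sum(patch * kernel)   # elementwise product, then sum
    return out

# Vertical-edge detection demo: left half bright, right half dark
img = np.hstack([np.ones((4, 2)) * 10, np.zeros((4, 2))])
k = np.array([[1., 0., -1.]] * 3)   # classic 3x3 vertical-edge filter
edges = conv2d_single(img, k)
```

Every output value here is large and positive because each 3x3 window straddles the bright-to-dark boundary, which is exactly the response an edge filter is meant to give.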
- Week 2 Deep convolutional models: case studies
- Understand multiple foundational papers of convolutional neural networks
- Analyze the dimensionality reduction of a volume in a very deep network
- Understand and implement a Residual Network
- Build a deep neural network using Keras
- Implement a skip-connection in your network
- Clone a repository from GitHub and use transfer learning
- Keras - Tutorial - Happy House v2
- Residual Networks - v2
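The skip connection at the heart of a residual block can be sketched without any framework: the main path computes a residual, and the shortcut adds the input back before the final activation. This is a simplified dense-layer sketch (the course's version uses Keras convolutional layers); the shapes and weight scale are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

def identity_block(x, W1, W2):
    """Main path: two linear stages with a ReLU between; shortcut adds x back."""
    out = relu(x @ W1)
    out = out @ W2
    return relu(out + x)   # the skip connection

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))       # batch of 3, feature size 4 (illustrative)
W1 = rng.standard_normal((4, 4)) * 0.01
W2 = rng.standard_normal((4, 4)) * 0.01
y = identity_block(x, W1, W2)
```

With the main-path weights near zero the block computes approximately `relu(x)`: the identity is easy for the block to represent, which is the intuition for why very deep residual networks remain trainable.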
- Week 3 Object detection
- Understand the challenges of Object Localization, Object Detection and Landmark Finding
- Understand and implement non-max suppression
- Understand and implement intersection over union
- Understand how we label a dataset for an object detection application
- Remember the vocabulary of object detection (landmark, anchor, bounding box, grid, ...)
- Car detection with YOLOv2
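Intersection over union and non-max suppression, both listed above, fit in a few lines: IoU measures overlap between two boxes, and NMS greedily keeps the highest-scoring box while discarding boxes that overlap it too much. The boxes and scores below are made-up illustrative data.

```python
def iou(box1, box2):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    xi1 = max(box1[0], box2[0])
    yi1 = max(box1[1], box2[1])
    xi2 = min(box1[2], box2[2])
    yi2 = min(box1[3], box2[3])
    inter = max(0, xi2 - xi1) * max(0, yi2 - yi1)   # 0 if no overlap
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    union = area1 + area2 - inter
    return inter / union if union > 0 else 0.0

def non_max_suppression(boxes, scores, iou_threshold=0.5):
    """Greedily keep the best-scoring box, drop boxes overlapping it too much."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= iou_threshold]
    return keep

boxes = [(0, 0, 2, 2), (1, 1, 3, 3), (5, 5, 7, 7)]   # two overlapping, one apart
scores = [0.9, 0.8, 0.7]
kept = non_max_suppression(boxes, scores, iou_threshold=0.1)
```

Here the second box overlaps the first (IoU = 1/7) above the threshold and is suppressed, while the distant third box survives.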
- Week 4 Special applications: Face recognition & Neural style transfer
- Understand One Shot Learning, Siamese Network, Triplet Loss
- Understand Content Cost Function, Style Cost Function, 1D and 3D Generalizations
- Deep Learning & Art: Neural Style Transfer
- Face Recognition for the Happy House
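The triplet loss from Week 4 pushes an anchor embedding closer to a positive (same identity) than to a negative (different identity) by at least a margin `alpha`. A minimal NumPy sketch with made-up 2-D embeddings standing in for a face-encoding network's output:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, alpha=0.2):
    """max(||f(A)-f(P)||^2 - ||f(A)-f(N)||^2 + alpha, 0), averaged over the batch."""
    pos_dist = np.sum((anchor - positive) ** 2, axis=1)   # squared distance A-P
    neg_dist = np.sum((anchor - negative) ** 2, axis=1)   # squared distance A-N
    return np.mean(np.maximum(pos_dist - neg_dist + alpha, 0.0))

# Hypothetical embeddings: positives close to anchors, negatives far away
anchor   = np.array([[1.0, 0.0], [0.0, 1.0]])
positive = np.array([[1.1, 0.0], [0.0, 0.9]])
negative = np.array([[0.0, 1.0], [1.0, 0.0]])
loss = triplet_loss(anchor, positive, negative)
```

When the negative is already farther than the positive by more than the margin, the hinge clamps the loss to zero, so training only focuses on triplets that still violate the margin.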