
Collection of my notes on neural networks and practical examples

📓 NN and DL 📓

Theory

Logistic Regression as a Neural Network

  1. Logistic Regression as a Neural Network
  2. Cost Function
  3. Gradient Descent
  4. Computational Graph
  5. Logistic Regression Gradient Descent
  6. Examples of Gradient Descent
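
A minimal sketch of items 3–6 above: one gradient-descent step for logistic regression, written with an explicit loop over the m examples to mirror the computational-graph derivation. The shapes and variable names (X of shape (n_x, m), Y of shape (1, m), w of shape (n_x, 1)) are assumptions for illustration, not code from the notebooks in this repo.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_step(w, b, X, Y, learning_rate=0.01):
    """One gradient-descent step for logistic regression with an explicit
    loop over the m examples. X: (n_x, m), Y: (1, m), w: (n_x, 1)."""
    n_x, m = X.shape
    dw = np.zeros((n_x, 1))
    db, cost = 0.0, 0.0
    for i in range(m):
        x_i = X[:, i:i+1]                     # i-th example as a column vector
        z_i = (w.T @ x_i).item() + b          # linear part z = w.T x + b
        a_i = sigmoid(z_i)                    # prediction a = sigma(z)
        cost += -(Y[0, i] * np.log(a_i) + (1 - Y[0, i]) * np.log(1 - a_i))
        dz_i = a_i - Y[0, i]                  # dL/dz from the computational graph
        dw += x_i * dz_i                      # accumulate dL/dw = x * dz
        db += dz_i                            # accumulate dL/db = dz
    cost, dw, db = cost / m, dw / m, db / m
    return w - learning_rate * dw, b - learning_rate * db, cost
```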

Python and Vectorization

  1. Vectorization
  2. Vectorized Implementation of Logistic Regression
  3. Computing the Vectorized Logistic Regression Gradient
  4. Broadcasting in Python
  5. Numpy Vectors
  6. Justification of Logistic Regression Cost Function
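
A short sketch of the same update as in the previous section, now fully vectorized with NumPy; broadcasting adds the scalar bias b to every column of Z, so no explicit loop over examples is needed. Shapes are the same illustrative assumptions (X: (n_x, m), Y: (1, m), w: (n_x, 1)).

```python
import numpy as np

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def vectorized_logistic_step(w, b, X, Y, learning_rate=0.01):
    """Whole-batch forward and backward pass with no explicit loop."""
    m = X.shape[1]
    Z = w.T @ X + b                 # broadcasting adds b to every column
    A = sigmoid(Z)                  # (1, m) predictions
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dZ = A - Y                      # (1, m)
    dw = (X @ dZ.T) / m             # (n_x, 1)
    db = np.sum(dZ) / m
    return w - learning_rate * dw, b - learning_rate * db, cost
```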

Shallow Neural Network

  1. Neural Networks Overview
  2. Vectorized Implementation of a Neural Network
  3. Activation Functions
  4. Derivatives of Activation Functions
  5. Gradient Descent for Neural Networks
  6. Random Initialization
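
The sketch below ties the items above together for a single hidden layer: tanh activations, their derivative 1 - tanh(z)^2 in backprop, and small random (non-zero) initialization so hidden units break symmetry. Layer sizes and parameter names are illustrative assumptions.

```python
import numpy as np

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def init_params(n_x, n_h, seed=0):
    """Small random weights (not zeros) so hidden units break symmetry."""
    rng = np.random.default_rng(seed)
    return {"W1": rng.normal(size=(n_h, n_x)) * 0.01, "b1": np.zeros((n_h, 1)),
            "W2": rng.normal(size=(1, n_h)) * 0.01, "b2": np.zeros((1, 1))}

def forward(X, params):
    """One hidden layer (tanh) + sigmoid output. X: (n_x, m)."""
    Z1 = params["W1"] @ X + params["b1"]
    A1 = np.tanh(Z1)
    Z2 = params["W2"] @ A1 + params["b2"]
    A2 = sigmoid(Z2)
    return A1, A2

def backward(X, Y, A1, A2, params):
    """Gradients using d tanh(z)/dz = 1 - tanh(z)**2 and dZ2 = A2 - Y."""
    m = X.shape[1]
    dZ2 = A2 - Y
    dW2 = (dZ2 @ A1.T) / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = (params["W2"].T @ dZ2) * (1 - A1 ** 2)
    dW1 = (dZ1 @ X.T) / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"W1": dW1, "b1": db1, "W2": dW2, "b2": db2}
```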

Deep Neural Network

  1. Deep L-layer neural network
  2. Forward Propagation in a Deep Network
  3. Getting matrix dimensions right
  4. Why deep networks?
  5. Building blocks of deep neural networks
  6. Forward and Backward Propagation
  7. Parameters vs Hyperparameters
  8. What does this have to do with the brain?
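
A rough sketch of forward propagation through an L-layer network, with assertions that check the matrix dimensions discussed above (W[l] is (n_l, n_{l-1}), b[l] is (n_l, 1)). The helper names are assumptions, not this repo's utilities.

```python
import numpy as np

def relu(Z):
    return np.maximum(0, Z)

def sigmoid(Z):
    return 1.0 / (1.0 + np.exp(-Z))

def init_deep(layer_dims, seed=0):
    """layer_dims = [n_x, n_1, ..., n_L]; W[l]: (n_l, n_{l-1}), b[l]: (n_l, 1)."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = rng.normal(size=(layer_dims[l], layer_dims[l - 1])) * 0.01
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def deep_forward(X, params):
    """ReLU for hidden layers, sigmoid for the output layer; asserts check shapes."""
    L = len(params) // 2
    A = X
    for l in range(1, L):
        Z = params[f"W{l}"] @ A + params[f"b{l}"]
        assert Z.shape == (params[f"W{l}"].shape[0], X.shape[1])
        A = relu(Z)
    AL = sigmoid(params[f"W{L}"] @ A + params[f"b{L}"])
    return AL
```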

Machine Learning Application

  1. Train/Dev/Test sets
  2. Bias and Variance Tradeoff
  3. Basic Recipe for Machine Learning

Neural Network Regularization

  1. Regularization
  2. Dropout Regularization
  3. Other Regularization Methods
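
Two small sketches of the ideas above: the L2 term added to the cost, and inverted dropout applied during the forward pass. The hyperparameters lambd and keep_prob, and the helper names, are illustrative assumptions.

```python
import numpy as np

def cost_with_l2(cross_entropy_cost, weight_matrices, lambd, m):
    """Add (lambda / 2m) * sum of squared weights to the unregularized cost."""
    l2_term = (lambd / (2 * m)) * sum(np.sum(np.square(W)) for W in weight_matrices)
    return cross_entropy_cost + l2_term

def dropout_forward(A, keep_prob, seed=0):
    """Inverted dropout: zero out units with probability 1 - keep_prob and
    rescale by keep_prob so expected activations are unchanged (training only)."""
    rng = np.random.default_rng(seed)
    D = (rng.random(A.shape) < keep_prob).astype(float)   # dropout mask
    return (A * D) / keep_prob, D                          # mask is reused in backprop
```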

Optimization problem

  1. Normalizing Inputs
  2. Vanishing/Exploding Gradients
  3. Weight Initialization for Deep Networks
  4. Numerical approximation of gradients
  5. Gradient Checking
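
A minimal sketch of gradient checking: the two-sided numerical approximation (f(theta + eps) - f(theta - eps)) / (2 * eps) compared against an analytic gradient via a relative-difference test. The toy cost J(theta) = sum(theta^2) is only for illustration.

```python
import numpy as np

def numerical_gradient(f, theta, eps=1e-7):
    """Two-sided difference approximation of dJ/dtheta, one component at a time."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus[i] += eps
        minus[i] -= eps
        grad[i] = (f(plus) - f(minus)) / (2 * eps)
    return grad

def gradient_check(analytic_grad, numeric_grad, tol=1e-7):
    """Relative difference ||ga - gn|| / (||ga|| + ||gn||); small => backprop likely correct."""
    diff = np.linalg.norm(analytic_grad - numeric_grad) / (
        np.linalg.norm(analytic_grad) + np.linalg.norm(numeric_grad))
    return diff, diff < tol

# Toy check: J(theta) = sum(theta**2) has gradient 2 * theta.
theta = np.array([1.0, -2.0, 3.0])
diff, ok = gradient_check(2 * theta, numerical_gradient(lambda t: np.sum(t ** 2), theta))
```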

Optimization Algorithms

  1. Mini-batch Gradient Descent
  2. Exponentially weighted averages
  3. Bias correction in exponentially weighted averages
  4. Gradient descent with momentum
  5. RMSprop
  6. Adam
  7. Learning rate decay
  8. Local optima
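
A sketch of a single Adam update, which combines momentum (first moment), an RMSprop-style second moment, and bias correction by 1 - beta^t, so it touches items 2–6 above in one place. The default hyperparameters follow the common beta1 = 0.9, beta2 = 0.999 choice; names are assumptions for illustration.

```python
import numpy as np

def adam_update(param, grad, v, s, t, learning_rate=0.001,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array; t is the iteration count (>= 1)."""
    v = beta1 * v + (1 - beta1) * grad            # exponentially weighted gradient (momentum)
    s = beta2 * s + (1 - beta2) * grad ** 2       # exponentially weighted squared gradient (RMSprop)
    v_hat = v / (1 - beta1 ** t)                  # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)                  # bias-corrected second moment
    param = param - learning_rate * v_hat / (np.sqrt(s_hat) + eps)
    return param, v, s
```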

Hyperparameter tuning

  1. Tuning Process

Batch Normalization

  1. Normalizing activations in a network
  2. Fitting Batch Norm into a neural network
  3. Why does Batch Norm work?
  4. Batch Norm at test time
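
A sketch of batch-norm forward behaviour: normalize with mini-batch statistics at training time, keep exponentially weighted running estimates, and reuse those estimates at test time. The parameter names (gamma, beta, momentum) and shapes are assumptions for illustration.

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, running_mean, running_var,
                      training=True, momentum=0.9, eps=1e-8):
    """Z: (n, m) pre-activations; gamma, beta, running stats: (n, 1)."""
    if training:
        mu = Z.mean(axis=1, keepdims=True)
        var = Z.var(axis=1, keepdims=True)
        running_mean = momentum * running_mean + (1 - momentum) * mu
        running_var = momentum * running_var + (1 - momentum) * var
    else:
        mu, var = running_mean, running_var       # reuse training-time statistics
    Z_norm = (Z - mu) / np.sqrt(var + eps)
    return gamma * Z_norm + beta, running_mean, running_var
```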

Multi-class classification

  1. Softmax Regression
  2. Training a softmax classifier
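
A small sketch of softmax with the usual max-subtraction for numerical stability, plus the cross-entropy cost and its convenient gradient dZ = A - Y with respect to the logits. Shapes are assumptions (n_classes rows, m columns).

```python
import numpy as np

def softmax(Z):
    """Column-wise softmax with the max subtracted for numerical stability."""
    Z_shift = Z - Z.max(axis=0, keepdims=True)
    expZ = np.exp(Z_shift)
    return expZ / expZ.sum(axis=0, keepdims=True)

def softmax_cross_entropy(Z, Y_onehot):
    """Z: (n_classes, m) logits, Y_onehot: (n_classes, m). Returns cost and dZ."""
    m = Z.shape[1]
    A = softmax(Z)
    cost = -np.sum(Y_onehot * np.log(A + 1e-12)) / m
    dZ = (A - Y_onehot) / m        # gradient of the cost w.r.t. the logits
    return cost, dZ
```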

Deep Learning Programming Frameworks

  1. Deep learning programming frameworks
  2. TensorFlow

📓 ML in Practice 📓

ML Strategy

  1. Why Machine Learning Strategy
  2. Orthogonalization

Goal Setting

  1. Single number evaluation metric
  2. Satisficing and Optimizing metric
  3. Train/dev/test distributions
  4. Size of dev/test data
  5. When to change dev/test sets and metrics

Comparing to human-level performance

  1. Why human-level performance?
  2. Avoidable bias
  3. Understanding human-level performance
  4. Surpassing human-level performance
  5. Improving your model performance
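
A tiny worked example of avoidable bias versus variance, treating human-level error as a proxy for Bayes error; the error rates below are hypothetical.

```python
# Hypothetical error rates, only to illustrate the decomposition:
human_error, train_error, dev_error = 0.01, 0.08, 0.10
avoidable_bias = train_error - human_error    # 0.07 -> reducing bias is the priority
variance = dev_error - train_error            # 0.02
```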

Error Analysis

  1. Error analysis
  2. Cleaning incorrectly labeled data
  3. Build system quickly then iterate

Mismatched training and dev/test set

  1. Training and testing on different distributions
  2. Bias and Variance with mismatched data distributions
  3. Addressing data mismatch

Learning from multiple tasks

  1. Transfer learning
  2. Multi-task learning

End-to-end deep learning

  1. What is end-to-end deep learning?
  2. Whether to use end-to-end deep learning

📓 Convolutional Neural Nets 📓

Theory

  1. Computer Vision
  2. Edge Detection
  3. Padding
  4. Strided Convolution
  5. Convolutions Over Volume
  6. One Layer Convolutional Network
  7. Simple Convolutional Network
  8. Pooling Layer
  9. CNN Example
  10. Why are Convolutions useful?
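
Two small sketches related to the items above: the output-size formula floor((n + 2p - f) / s) + 1 for padding p and stride s, and a single convolution step over a volume slice. The example numbers and helper names are illustrative assumptions.

```python
import numpy as np

def conv_output_size(n, f, padding=0, stride=1):
    """Output height/width of a convolution: floor((n + 2p - f) / s) + 1."""
    return (n + 2 * padding - f) // stride + 1

def conv_single_step(a_slice, W, b):
    """One output position: elementwise product of an (f, f, n_c_prev) slice
    with one filter of the same shape, summed, plus the bias."""
    return float(np.sum(a_slice * W) + b)

print(conv_output_size(n=6, f=3, padding=0, stride=1))   # 4
print(conv_output_size(n=7, f=3, padding=1, stride=2))   # 4
```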

Deep convolutional models: case studies

  1. LeNet-5, AlexNet, VGG-16
  2. ResNets
  3. Networks in Networks and 1x1 Convolutions
  4. GoogLeNet: Inception Network

ConvNet in Practice

  1. Data Augmentation
  2. State of Computer Vision

Object Detection

  1. Object Localization
  2. Landmark Detection
  3. Object Detection
  4. Convolutional Implementation of Sliding Windows
  5. YOLO algorithm
  6. IoU
  7. Non-max suppression
  8. Anchor Boxes
  9. R-CNN

Special applications: Face recognition & Neural style transfer

  1. Face Recognition
  2. Neural Style Transfer
  3. Convolutional Networks in 1D and 3D

Sequence models

  1. Sequence Data Examples

RNN

  1. Notations
  2. Recurrent Neural Network Model
  3. Language model and sequence generation
  4. Sampling novel sequences
  5. GRU
  6. LSTM
  7. Bidirectional RNN
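
A sketch of one forward time step of a basic RNN cell (item 2): a_t = tanh(Waa @ a_prev + Wax @ x_t + ba) followed by a softmax output. The parameter-dictionary keys and shapes are assumptions, not names from the notebooks.

```python
import numpy as np

def rnn_cell_forward(x_t, a_prev, params):
    """One RNN time step. x_t: (n_x, m), a_prev: (n_a, m);
    Waa: (n_a, n_a), Wax: (n_a, n_x), Wya: (n_y, n_a)."""
    a_t = np.tanh(params["Waa"] @ a_prev + params["Wax"] @ x_t + params["ba"])
    z_y = params["Wya"] @ a_t + params["by"]
    y_t = np.exp(z_y - z_y.max(axis=0, keepdims=True))   # stable softmax
    y_t /= y_t.sum(axis=0, keepdims=True)
    return a_t, y_t
```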

Word Embedding

  1. Word Representation
  2. Learning word embeddings
  3. Neural Network Language Model
  4. Word2Vec
  5. GloVe
  6. Debiasing word embedding

Sequence to Sequence model

  1. Basic Model
  2. Picking the most likely sentence
  3. Beam Search Algorithm
  4. Error Analysis in Beam Search
  5. Attention Model
  6. Speech Recognition

Generative Adversarial Nets

Applied GAN

Summary of Important Papers

TensorFlow Basic

NLP in TensorFlow

Sequence/time series and predictions

Deep neural network for time series

Real-world time series data

Implementations in Python and R

Implementations in R:

  1. Neural Network Classification
  2. Simple 2-hidden-layer network with Keras

Implementations in Python:

  1. Building an image recognition algorithm using logistic regression with a neural network mindset
  2. Building a 2-layer neural network for a binary classification problem
  3. Building neural network utilities
  4. Building a 2-layer neural network and a deep neural network from scratch
  5. Initialization
    • initialize_parameters_zeros
    • initialize_parameters_random
    • initialize_parameters_he
  6. Regularization
    • compute_cost_with_regularization
    • backward_propagation_with_regularization
    • forward_propagation_with_dropout
    • backward_propagation_with_dropout
  7. Gradient Checking
    • gradient_check
    • gradient_check_n
  8. Optimization
    • update_parameters_with_gd
    • random_mini_batches
    • initialize_velocity
    • update_parameters_with_momentum
    • initialize_adam
    • update_parameters_with_adam
  9. TensorFlow: Build a neural network for a multi-class classification problem
  10. Building Convnet with numpy
  11. Building Convnet with TensorFlow
  12. Building ResNets with Keras
  13. AEDECOD and AETERM similarity using the Universal Sentence Encoder (Transformer)
  14. Object detection using pre-trained YOLO weights
  15. Neural Style Transfer
  16. Face Recognition
  17. Character-level language model - LSTM
  18. Generate Text - LSTM
  19. NLP tasks
