
Machine learning implementations

This repo contains implementations of popular machine learning algorithms, along with the supporting techniques they rely on:

  • Logistic Regression with regularization
  • Multilayer neural network with:
    • supported layer activations: ReLU, sigmoid, softmax
    • optimizers: gradient descent, GD with momentum, RMSProp, Adam
    • mini-batch training
    • adjustable number of hidden layers

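To give a feel for the logistic-regression piece, here is a minimal sketch of gradient descent on the L2-regularized cross-entropy loss. Function and parameter names (`fit_logistic`, `lam`, `lr`) are illustrative, not the repo's actual API:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lam=0.1, lr=0.1, epochs=500):
    """Gradient descent on L2-regularized logistic loss.

    lam controls the strength of the L2 penalty on the weights;
    the bias term is conventionally left unregularized.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(X @ w + b)
        grad_w = X.T @ (p - y) / n + lam * w / n  # data term + L2 penalty
        grad_b = np.sum(p - y) / n                # bias: no penalty
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy 1-D linearly separable data
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = fit_logistic(X, y)
preds = (sigmoid(X @ w + b) >= 0.5).astype(int)
```

The regularization term shrinks the weights toward zero, which keeps them finite even on separable data like the toy set above.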
A few IPython notebooks illustrate the results these implementations can achieve and compare them with similar implementations in the Keras library.
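The optimizer variants listed above build on the same gradient-descent update. As one example, here is a self-contained sketch of an Adam step with bias-corrected moment estimates, applied to a toy quadratic; the `adam_step` name and `state` dict layout are assumptions for illustration, not the repo's interface:

```python
import numpy as np

def adam_step(w, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction for the zero init."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return w - lr * m_hat / (np.sqrt(v_hat) + eps)

# Minimize f(w) = w^2 (gradient 2w) starting from w = 5
w = np.array([5.0])
state = {"t": 0, "m": np.zeros(1), "v": np.zeros(1)}
for _ in range(5000):
    w = adam_step(w, 2 * w, state, lr=0.05)
```

Momentum and RMSProp fall out as special cases: momentum keeps only the `m` average, RMSProp only the `v` average without bias correction.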

Things to add (but haven't had time for yet)

  • Dropout in the neural network
  • Batch normalization in the neural network
  • Tanh activation
  • ConvNet implementation
  • Some sequence model implementations
  • More notebooks experimenting with models implemented using Keras and TensorFlow