Beginning in 1958 at the Cornell Aeronautical Laboratory, the fundamentals of neural networks, and with them much of the groundwork of machine learning, were laid by the American psychologist Frank Rosenblatt. In the decades since, the field has grown enormously in abstraction, complexity and sophistication, maturing into the practical, reliable data-processing instruments in everyday use today. With that abstraction, however, and with the wide availability of machine learning libraries, it has become common to apply these tools without understanding the interwoven algorithmic, mathematical and biological principles on which neural networks operate.

This project revisits the origins of this subfield by building a program that can construct, train and validate a network entirely from scratch in Python. This includes implementing forward propagation and backpropagation, two algorithms that demand a solid grasp of the underlying mathematics: the linear-algebraic structure of the network's input, hidden and output layers, and the calculus behind the gradient vector of partial derivatives through which the network is trained. With the addition of modern optimization algorithms and the Categorical Cross-Entropy loss function, the network becomes a general tool for categorical analysis, and the combination of these elements yields a model capable of producing dependable results. Finally, to highlight one of the field's key technological advances, the project draws a comparison between the now-widely used Rectified Linear Unit (ReLU) and the older Sigmoid activation functions, examining their training efficacy and difficulty through both a mathematical and an empirical lens.
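To make the pieces described above concrete, here is a minimal sketch of the full loop: a forward pass built from matrix products, a softmax output with Categorical Cross-Entropy loss, backpropagation via the chain rule, and a plain gradient-descent update. The toy three-cluster dataset, layer sizes and learning rate are illustrative assumptions, not the project's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: three well-separated 2-D clusters, one per class (hypothetical stand-in)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(10, 2))
               for c in [(-2, 0), (2, 0), (0, 2)]])
y = np.repeat([0, 1, 2], 10)

# One hidden layer, 2 -> 8 -> 3 (sizes chosen only for illustration)
W1 = rng.normal(scale=0.1, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 3)); b2 = np.zeros(3)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

losses = []
lr = 0.5
for step in range(300):
    # Forward pass: each layer is a matrix product plus a bias (the linear-algebra core)
    z1 = X @ W1 + b1
    a1 = relu(z1)
    probs = softmax(a1 @ W2 + b2)

    # Categorical Cross-Entropy: negative log-probability of the true class
    losses.append(-np.log(probs[np.arange(len(y)), y] + 1e-12).mean())

    # Backpropagation: chain rule, layer by layer, from the output back to the input
    dz2 = probs.copy()
    dz2[np.arange(len(y)), y] -= 1          # gradient of CCE w.r.t. softmax input
    dz2 /= len(y)
    dW2 = a1.T @ dz2;          db2 = dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (z1 > 0)           # ReLU derivative: 1 where active, else 0
    dW1 = X.T @ dz1;           db1 = dz1.sum(axis=0)

    # Vanilla gradient descent (the modern optimizers mentioned above refine this step)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

On this easily separable toy data the loss falls from roughly ln(3) toward zero, which is the behavior the from-scratch implementation aims to reproduce on real datasets.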
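The mathematical side of the ReLU-versus-Sigmoid comparison rests on a simple fact about their derivatives, which a few lines can verify numerically (the value grid here is an arbitrary choice for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = np.linspace(-6, 6, 1001)
sig_grad = sigmoid(z) * (1.0 - sigmoid(z))  # derivative of sigmoid
relu_grad = (z > 0).astype(float)           # derivative of ReLU: 0 or 1

# Sigmoid's derivative never exceeds 0.25 (its peak, at z = 0), so repeated
# chain-rule products shrink geometrically through deep stacks of layers
# (the vanishing-gradient problem). ReLU's derivative is exactly 1 wherever
# the unit is active, letting gradients pass through undiminished.
```

This gap in gradient magnitude is one mathematical reason the empirical part of the comparison tends to show ReLU networks training faster than Sigmoid ones.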
All in all, this project is intended as an academic resource on the often-hidden principles of neural networks and on the advances that have led to the field's widespread success and use in today's world.

Primary resource: Kinsley, Harrison, and Daniel Kukieła. Neural Networks from Scratch in Python. Harrison Kinsley, 2020.