Machine Learning
Jacob Campanile 10/21/2025

As machine learning started blowing up, I really wanted to understand how it actually worked. I decided to build a neural network with backpropagation from scratch in Python, first without NumPy and then again with NumPy; writing the first version without NumPy forced me to work through every calculation by hand. My goal for the project was to train the network on MNIST with a testing accuracy of 90% or better.
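To give a sense of what "from scratch" means here, below is a minimal sketch of one forward and backward pass for a single-hidden-layer network in NumPy. The layer sizes, sigmoid activation, loss, and variable names are my own assumptions for illustration, not the actual code from the project.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_backward(x, y, W1, b1, W2, b2):
    """One forward pass and one backpropagation pass for a tiny 2-layer net."""
    # Forward pass
    z1 = W1 @ x + b1           # hidden pre-activation
    a1 = sigmoid(z1)           # hidden activation
    z2 = W2 @ a1 + b2          # output pre-activation
    a2 = sigmoid(z2)           # network output

    # Backward pass (squared-error loss for simplicity)
    delta2 = (a2 - y) * a2 * (1 - a2)         # error at the output layer
    delta1 = (W2.T @ delta2) * a1 * (1 - a1)  # error propagated to the hidden layer

    grads = {
        "W2": np.outer(delta2, a1), "b2": delta2,
        "W1": np.outer(delta1, x),  "b1": delta1,
    }
    return a2, grads
```

The non-NumPy version would do the same thing with plain Python lists and loops, which is slower but makes every multiply-and-sum explicit.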

It took about a month to complete my goal. I used 3Blue1Brown's series on neural networks to learn how it all worked. I also ended up implementing the Adam optimizer alongside the network to get a final result of 92.5% testing accuracy in about 9 seconds on my M1 MacBook Air.
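For reference, a single Adam update step looks roughly like the sketch below. The hyperparameter values are the common defaults from the Adam paper, not necessarily the ones I tuned for this project.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one Adam update to a parameter array and return the new state."""
    m = beta1 * m + (1 - beta1) * grad        # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad**2     # running mean of squared gradients
    m_hat = m / (1 - beta1**t)                # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Compared to plain gradient descent, Adam adapts the step size per weight, which is a big part of why the network converges in seconds rather than minutes.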

All of the code is on my GitHub page. The network is simple and modular, so the project is easy to build on. The training and testing data, along with the code I used to train the network on MNIST, are there too.