This project implements a neural network entirely from scratch using JavaScript, with the goal of understanding the mathematics behind neural networks, particularly backpropagation.
- Feedforward Propagation: Implemented the forward pass, building an understanding of how data flows through the layers.
- Backpropagation: Developed a thorough understanding of the algorithm, including error calculation and gradient computation.
- Gradient Descent: Implemented stochastic gradient descent for parameter optimization.
- Activation Functions: Explored different functions and their impact on learning.
- Matrix Operations: Used extensively throughout, reinforcing my understanding of linear algebra. Applied some concepts from my MATH 214 class!
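As a minimal sketch of the forward pass described above (hypothetical function names, not necessarily the repository's actual API): each layer computes a weighted sum of its inputs plus a bias, then applies the sigmoid activation.

```javascript
// Sigmoid activation squashes each pre-activation into (0, 1).
const sigmoid = z => 1 / (1 + Math.exp(-z));

// One layer's forward pass: multiply the weight matrix (an array of rows)
// by the input vector, add the biases, and apply the activation.
function feedforwardLayer(weights, biases, inputs) {
  return weights.map((row, i) =>
    sigmoid(row.reduce((sum, w, j) => sum + w * inputs[j], biases[i]))
  );
}

// Chain layers: the output of one layer becomes the input of the next.
function feedforward(layers, inputs) {
  return layers.reduce(
    (activations, { weights, biases }) =>
      feedforwardLayer(weights, biases, activations),
    inputs
  );
}
```

With zero weights and biases, every output is sigmoid(0) = 0.5, which is a handy sanity check when testing the forward pass.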
The `Network` class initializes the neural network and provides:
- Methods for feedforward propagation and backpropagation
- Stochastic Gradient Descent (SGD) for training
- Helper functions for mathematical operations
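The class structure above might look something like the following single-layer sketch (the real class likely supports arbitrary layer sizes; names and shapes here are assumptions). Training uses one SGD step per example with the delta rule, which is backpropagation specialized to a single layer: the error gradient with respect to each pre-activation is (a − y) · σ′(z).

```javascript
const sigmoid = z => 1 / (1 + Math.exp(-z));
const sigmoidPrime = z => sigmoid(z) * (1 - sigmoid(z));

class Network {
  constructor(numInputs, numOutputs) {
    // Small random weights, zero biases.
    this.weights = Array.from({ length: numOutputs }, () =>
      Array.from({ length: numInputs }, () => Math.random() - 0.5));
    this.biases = new Array(numOutputs).fill(0);
  }

  // Forward pass: a = sigmoid(Wx + b).
  feedforward(x) {
    return this.weights.map((row, i) =>
      sigmoid(row.reduce((s, w, j) => s + w * x[j], this.biases[i])));
  }

  // One SGD step on a single (x, y) training example.
  trainStep(x, y, eta) {
    // Pre-activations z and activations a for this input.
    const zs = this.weights.map((row, i) =>
      row.reduce((s, w, j) => s + w * x[j], this.biases[i]));
    const as = zs.map(sigmoid);
    // delta_i = (a_i - y_i) * sigma'(z_i): gradient of the squared error
    // with respect to z_i.
    const deltas = as.map((a, i) => (a - y[i]) * sigmoidPrime(zs[i]));
    // Gradient descent update: step each parameter against its gradient.
    deltas.forEach((d, i) => {
      this.biases[i] -= eta * d;
      this.weights[i] = this.weights[i].map((w, j) => w - eta * d * x[j]);
    });
  }
}
```

Repeatedly calling `trainStep` on an example drives the network's output toward the target, which is the stochastic part of SGD: parameters are updated from one example at a time rather than from the full dataset.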