🌟 Implementation of Neural Networks from Scratch Using Python & Numpy 🌟
Uses Python 3.7.4
Optimizer Functions
Optimizers update the network's parameters as efficiently as possible: they adjust the weights and bias terms to minimize the loss function, ideally driving it toward the global minimum (in practice, a good local minimum).
Gradient Descent
W: weights | dW: weights gradient (obtained from loss function) | alpha: learning rate
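Using the symbols above, a single vanilla gradient descent step is `W := W - alpha * dW`. A minimal sketch in NumPy (the function name is illustrative, not from this repo):

```python
import numpy as np

def gradient_descent_update(W, dW, alpha=0.01):
    """One vanilla gradient descent step: W := W - alpha * dW."""
    return W - alpha * dW

# Example: step the weights against the gradient direction
W = np.array([1.0, 2.0])
dW = np.array([0.5, -0.5])
W = gradient_descent_update(W, dW, alpha=0.1)  # -> [0.95, 2.05]
```

The same rule applies to the bias terms, using their own gradient `db`.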
Gradient Descent with Momentum
vdW: velocity term (exponentially weighted moving average of the weight gradients) | beta: momentum term (dampening factor) | dJ/dW: weights gradient (obtained from loss function)
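Momentum smooths the update by averaging gradients over time: `vdW := beta * vdW + (1 - beta) * dW`, then `W := W - alpha * vdW`. A minimal sketch, assuming `vdW` is initialized to zeros (function name illustrative):

```python
import numpy as np

def momentum_update(W, dW, vdW, alpha=0.01, beta=0.9):
    """One gradient descent step with momentum.

    vdW := beta * vdW + (1 - beta) * dW   (exponentially weighted average)
    W   := W - alpha * vdW
    """
    vdW = beta * vdW + (1 - beta) * dW
    W = W - alpha * vdW
    return W, vdW

# Example: first step from a zero-initialized velocity
W = np.zeros(2)
vdW = np.zeros(2)
dW = np.ones(2)
W, vdW = momentum_update(W, dW, vdW, alpha=0.1, beta=0.9)
# vdW -> [0.1, 0.1], W -> [-0.01, -0.01]
```

With `beta = 0.9`, each step effectively averages over roughly the last 10 gradients, which damps oscillations and speeds up progress along consistent descent directions.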