Backpropagation
Definition
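Backpropagation is the algorithm that trains a neural network by computing the gradient of the loss with respect to every weight, applying the chain rule layer by layer from the output back to the input. Each layer receives the gradient of the loss with respect to its output, updates its own parameters, and hands the gradient with respect to its input to the layer before it. In symbols (the layer index l and the notation below are illustrative, not taken from this page):

$$\frac{\partial L}{\partial W^{(l)}} = \frac{\partial L}{\partial a^{(l)}} \cdot \frac{\partial a^{(l)}}{\partial z^{(l)}} \cdot \frac{\partial z^{(l)}}{\partial W^{(l)}}$$

where z^{(l)} is the layer's pre-activation and a^{(l)} its activation.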
Loss Function
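The loss function measures how far the network's output is from the target. Backpropagation only requires that it be differentiable, so that its gradient with respect to the output can seed the backward pass; two common choices, MSE and cross entropy, are given below.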
Loss derivative (gradient)
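The gradient of the loss with respect to the network output is where the backward pass begins: it is computed once, then propagated through the layers in reverse, with each layer consuming the incoming error and emitting the error for its predecessor. In the snippet below, compute_loss_gradient seeds the error, and each layer.backward call both updates that layer and returns the error for the previous one: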
# Backward pass
error = compute_loss_gradient(y, output)
for layer in reversed(self.layers):
    error = layer.backward(error, learning_rate, self.optimizer, epoch + 1)

Mean Squared Error (MSE)
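For n outputs, MSE averages the squared differences between targets y and predictions ŷ; its derivative with respect to each prediction is what seeds the backward pass:

$$L_{\text{MSE}} = \frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2, \qquad \frac{\partial L_{\text{MSE}}}{\partial \hat{y}_i} = \frac{2}{n}(\hat{y}_i - y_i)$$

A minimal NumPy sketch of the pair (the function names are illustrative, not this page's compute_loss_gradient):

import numpy as np

def mse(y, y_hat):
    # Mean of squared differences over all outputs
    return np.mean((y - y_hat) ** 2)

def mse_gradient(y, y_hat):
    # d(MSE)/d(y_hat) = 2 * (y_hat - y) / n
    return 2 * (y_hat - y) / y.size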
Cross Entropy
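Cross entropy compares a target distribution y with predicted probabilities ŷ and is the usual loss for classification:

$$L_{\text{CE}} = -\sum_i y_i \log \hat{y}_i$$

When the network's last layer is a softmax, the combined gradient with respect to the pre-softmax outputs simplifies to ŷ − y. A minimal sketch (the clipping is only for numerical stability; the name is illustrative):

import numpy as np

def cross_entropy(y, y_hat, eps=1e-12):
    # Clip predictions away from 0 and 1 so log() stays finite
    y_hat = np.clip(y_hat, eps, 1.0 - eps)
    return -np.sum(y * np.log(y_hat))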
Optimization Algorithms
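The optimizer determines how each layer converts its gradients into parameter updates; the self.optimizer passed into layer.backward above selects among the variants below, which differ in how they scale the raw gradient.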
SGD
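Plain stochastic gradient descent takes a fixed-size step against the gradient:

$$\theta \leftarrow \theta - \eta \, \nabla_\theta L$$

where η is the learning rate. A one-line sketch (the function name and default are illustrative):

def sgd_update(param, grad, lr=0.01):
    # Step each parameter against its gradient, scaled by the learning rate
    return param - lr * grad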
RMSProp
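RMSProp keeps an exponentially decaying average of squared gradients and divides each step by its square root, so directions with persistently large gradients take smaller, steadier steps:

$$v \leftarrow \beta v + (1-\beta)\, g^2, \qquad \theta \leftarrow \theta - \frac{\eta}{\sqrt{v} + \epsilon}\, g$$

A minimal sketch, assuming the caller stores the running average v per parameter (names and defaults are illustrative):

import numpy as np

def rmsprop_update(param, grad, v, lr=0.001, beta=0.9, eps=1e-8):
    # Decaying average of squared gradients
    v = beta * v + (1 - beta) * grad ** 2
    # Scale the step by the root of that average
    return param - lr * grad / (np.sqrt(v) + eps), v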
Adam
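Adam combines momentum (a first-moment average of the gradient) with RMSProp-style scaling (a second-moment average of its square), and bias-corrects both because they start at zero. The epoch + 1 argument passed to layer.backward in the snippet above is consistent with Adam needing a timestep t ≥ 1 for that correction, though this page does not spell that out:

$$m \leftarrow \beta_1 m + (1-\beta_1)\, g, \qquad v \leftarrow \beta_2 v + (1-\beta_2)\, g^2$$

$$\hat{m} = \frac{m}{1-\beta_1^t}, \qquad \hat{v} = \frac{v}{1-\beta_2^t}, \qquad \theta \leftarrow \theta - \frac{\eta\, \hat{m}}{\sqrt{\hat{v}} + \epsilon}$$

A minimal sketch, with the caller keeping m, v, and t per parameter (names and defaults are illustrative):

import numpy as np

def adam_update(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Moving averages of the gradient and its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction: m and v start at zero, so early averages undershoot
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), m, v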