A collection of various gradient descent algorithms implemented in Python from scratch
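The common core of these repositories is gradient descent with a momentum (velocity) term. As a minimal illustrative sketch — not the code of any listed repository — classical heavy-ball momentum looks like this:

```python
import numpy as np

def momentum_gd(grad, x0, lr=0.1, beta=0.9, steps=500):
    """Gradient descent with classical (heavy-ball) momentum.

    grad : callable returning the gradient at a point
    beta : momentum coefficient (fraction of velocity retained each step)
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x)  # accumulate an exponentially decaying velocity
        x = x + v                    # step along the velocity, not the raw gradient
    return x

# Example: minimize f(x) = ||x||^2 (gradient 2x); the minimum is at the origin.
x_min = momentum_gd(lambda x: 2.0 * x, [3.0, -4.0])
```

The velocity term smooths successive gradients, which damps oscillation across narrow valleys and accelerates progress along consistent descent directions.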
[ICML 2021] The official PyTorch Implementations of Positive-Negative Momentum Optimizers.
NAG-GS: Nesterov Accelerated Gradients with Gauss-Seidel splitting
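For context on the Nesterov family of methods referenced above, a minimal sketch of standard Nesterov accelerated gradient (not the NAG-GS splitting scheme itself, which follows the cited paper) differs from heavy-ball momentum only in where the gradient is evaluated:

```python
import numpy as np

def nesterov_gd(grad, x0, lr=0.1, beta=0.9, steps=500):
    """Nesterov accelerated gradient descent.

    The gradient is evaluated at the look-ahead point x + beta*v,
    i.e. where the momentum step is about to land, rather than at x.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x + beta * v)  # look-ahead gradient
        x = x + v
    return x

# Example: minimize f(x) = ||x||^2 (gradient 2x).
x_star = nesterov_gd(lambda x: 2.0 * x, [3.0, -4.0])
```

Evaluating the gradient at the look-ahead point gives the method a corrective term that improves the convergence rate on smooth convex problems.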
Overshoot: Taking advantage of future gradients in momentum-based stochastic optimization
Generic L-layer fully connected neural network implemented "straight in Python" using NumPy.
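As a hedged sketch of what such a NumPy-only network can look like (layer sizes, initialization scheme, and activations here are illustrative assumptions, not taken from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layers(sizes):
    """One (W, b) pair per layer; He-style initialization for ReLU layers."""
    return [(rng.standard_normal((m, n)) * np.sqrt(2.0 / n), np.zeros(m))
            for n, m in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Forward pass: ReLU on hidden layers, linear output layer."""
    for W, b in params[:-1]:
        x = np.maximum(0.0, W @ x + b)  # hidden layer with ReLU
    W, b = params[-1]
    return W @ x + b                    # linear output

# Example: a 4 -> 8 -> 3 network applied to one input vector.
params = init_layers([4, 8, 3])
y = forward(params, np.ones(4))
```

Training such a network then amounts to backpropagating the loss gradient through these layers and updating each (W, b) with one of the momentum optimizers listed above.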