Optimizers
==========

Popular gradient-based strategies for optimizing parameters in neural networks.

For a discussion of the generalization performance of the solutions found via different optimization strategies, see:

.. [1] Wilson et al. (2017). "The marginal value of adaptive gradient
   methods in machine learning." *Proceedings of the 31st Conference on
   Neural Information Processing Systems*. https://arxiv.org/pdf/1705.08292.pdf
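All of the optimizers documented below refine the same primitive: step in the direction of the negative gradient. As a baseline for comparison, here is plain batch gradient descent on a least-squares problem in raw NumPy. This is an illustrative sketch, not this package's API; the variable names are ours:

```python
import numpy as np

# Synthetic least-squares problem: recover w_true from y = X @ w_true.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

# Plain gradient descent: param <- param - lr * grad. Every optimizer
# below is a variation on this update with a different step rescaling.
w = np.zeros(3)
lr = 0.1
for _ in range(200):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5 * mean squared error
    w = w - lr * grad
```

The optimizers that follow replace the fixed global learning rate with momentum terms and per-parameter adaptive scalings.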

OptimizerBase
-------------

.. autoclass:: numpy_ml.neural_nets.optimizers.optimizers.OptimizerBase
   :members:
   :undoc-members:
   :show-inheritance:

SGD
---

.. autoclass:: numpy_ml.neural_nets.optimizers.SGD
   :members:
   :undoc-members:
   :show-inheritance:
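For orientation, the stochastic gradient descent update with a momentum term can be sketched in plain NumPy. The helper below is an illustration, not the library's API; ``lr`` and ``momentum`` mirror common hyperparameter names:

```python
import numpy as np

def sgd_step(param, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD step with momentum: v <- momentum * v - lr * grad,
    then param <- param + v. With momentum=0 this is vanilla SGD."""
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

# Minimize f(x) = 0.5 * ||x||^2, whose gradient is simply x.
x = np.array([1.0, -2.0])
v = np.zeros_like(x)
for _ in range(300):
    x, v = sgd_step(x, x, v)
```

The momentum term accumulates a decaying average of past gradients, which damps oscillation across steep directions and accelerates progress along shallow ones.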

AdaGrad
-------

.. autoclass:: numpy_ml.neural_nets.optimizers.AdaGrad
   :members:
   :undoc-members:
   :show-inheritance:
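The AdaGrad rule divides each coordinate's step by the square root of its accumulated squared gradients, so frequently-updated parameters take smaller steps. A minimal sketch (illustrative names, not the library's API):

```python
import numpy as np

def adagrad_step(param, grad, cache, lr=0.5, eps=1e-7):
    """AdaGrad: accumulate squared gradients in `cache` and scale each
    coordinate's step by 1 / sqrt(cache). `eps` avoids division by zero."""
    cache = cache + grad ** 2
    param = param - lr * grad / (np.sqrt(cache) + eps)
    return param, cache

# Minimize f(x) = 0.5 * ||x||^2 (gradient = x).
x = np.array([1.0, -2.0])
cache = np.zeros_like(x)
for _ in range(500):
    x, cache = adagrad_step(x, x, cache)
```

Because the cache only grows, the effective learning rate decays monotonically, which is the main motivation for the decaying-average variants (RMSProp, Adam) below.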

Adam
----

.. autoclass:: numpy_ml.neural_nets.optimizers.Adam
   :members:
   :undoc-members:
   :show-inheritance:
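Adam combines a momentum-style first-moment estimate with an RMSProp-style second-moment estimate, plus a bias correction for the zero initialization of both. A sketch of the update (illustrative names, not the library's API; hyperparameter defaults follow common convention):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step at (1-indexed) timestep t."""
    m = b1 * m + (1 - b1) * grad            # first moment (momentum)
    v = b2 * v + (1 - b2) * grad ** 2       # second moment (RMS scaling)
    m_hat = m / (1 - b1 ** t)               # correct bias toward zero
    v_hat = v / (1 - b2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Minimize f(x) = 0.5 * ||x||^2 (gradient = x).
x = np.array([1.0, -2.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 501):
    x, m, v = adam_step(x, x, m, v, t)
```

Note that the per-step displacement is roughly bounded by ``lr``, so on a quadratic the iterate settles into a small band around the minimum rather than converging exactly; see [1] for a discussion of how this adaptivity affects generalization.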

RMSProp
-------

.. autoclass:: numpy_ml.neural_nets.optimizers.RMSProp
   :members:
   :undoc-members:
   :show-inheritance:
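RMSProp is AdaGrad with an exponentially decaying, rather than ever-growing, cache of squared gradients, so the effective learning rate does not shrink to zero. A minimal sketch (illustrative names, not the library's API):

```python
import numpy as np

def rmsprop_step(param, grad, cache, lr=0.01, decay=0.9, eps=1e-7):
    """RMSProp: exponentially-weighted average of squared gradients
    replaces AdaGrad's monotone accumulator."""
    cache = decay * cache + (1 - decay) * grad ** 2
    param = param - lr * grad / (np.sqrt(cache) + eps)
    return param, cache

# Minimize f(x) = 0.5 * ||x||^2 (gradient = x).
x = np.array([1.0, -2.0])
cache = np.zeros_like(x)
for _ in range(600):
    x, cache = rmsprop_step(x, x, cache)
```

At steady state ``sqrt(cache)`` tracks the recent gradient magnitude, so each coordinate moves roughly ``lr`` per step regardless of the gradient's scale.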