Implements the Adam optimization algorithm [4]. More...

#include <adam.hpp>

Public Member Functions

    Adam (float lr=0.001f, float beta1=0.9f, float beta2=0.999f, float eps=1e-8f)
    virtual void step (dl::TensorPtr &loss) override
Detailed Description

Implements the Adam optimization algorithm [4].

Definition at line 10 of file adam.hpp.
◆ Adam()

dl::optim::Adam::Adam (float lr = 0.001f,
                       float beta1 = 0.9f,
                       float beta2 = 0.999f,
                       float eps = 1e-8f)

inline

Definition at line 18 of file adam.hpp.

    Adam(float lr = 0.001f, float beta1 = 0.9f, float beta2 = 0.999f, float eps = 1e-8f)
        : dl::Optimizer(), lr(lr), beta1(beta1), beta2(beta2), eps(eps) {}
◆ step()

virtual void dl::optim::Adam::step (dl::TensorPtr &loss) override

Overrides dl::Optimizer::step(). The base class dl::Optimizer defines an optimization strategy for a given set of Parameters.

The documentation for this class was generated from the following file:
    adam.hpp