libdl  0.0.1
Simple yet powerful deep learning
dl::optim::Adam Class Reference

Implements the Adam optimization algorithm [4]. More...

#include <adam.hpp>

Inheritance diagram for dl::optim::Adam:
Collaboration diagram for dl::optim::Adam:

Public Member Functions

 Adam (float lr=0.001f, float beta1=0.9f, float beta2=0.999f, float eps=1e-8f)
 
virtual void step (dl::TensorPtr &loss) override
 

Detailed Description

Implements the Adam optimization algorithm [4].

Definition at line 10 of file adam.hpp.
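For reference, the update rule that Adam [4] prescribes, using the constructor's hyper-parameters (lr, beta1, beta2, eps) in the paper's notation. This is background from the cited paper, not a transcription of libdl's code (step() is not yet implemented, see below):

```latex
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t \\
v_t &= \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2 \\
\hat{m}_t &= m_t / (1-\beta_1^t), \qquad
\hat{v}_t = v_t / (1-\beta_2^t) \\
\theta_t &= \theta_{t-1} - \mathrm{lr} \cdot \hat{m}_t / \left(\sqrt{\hat{v}_t} + \epsilon\right)
\end{aligned}
```

Here $g_t$ is the gradient at step $t$, $m_t$ and $v_t$ are the biased first- and second-moment estimates, and the hats denote their bias-corrected versions.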

Constructor & Destructor Documentation

◆ Adam()

dl::optim::Adam::Adam(float lr = 0.001f,
                      float beta1 = 0.9f,
                      float beta2 = 0.999f,
                      float eps = 1e-8f)  [inline]

Definition at line 18 of file adam.hpp.

: dl::Optimizer(), lr(lr), beta1(beta1), beta2(beta2), eps(eps) {}

Member Function Documentation

◆ step()

virtual void dl::optim::Adam::step(dl::TensorPtr &loss)  [inline, override, virtual]

Implements dl::Optimizer.

Definition at line 21 of file adam.hpp.

{ throw std::runtime_error("Not yet implemented"); }

The documentation for this class was generated from the following file: adam.hpp