Adam
class boulderopal.stochastic.Adam(learning_rate=0.01)
Adaptive moment estimation (Adam) optimizer for stochastic optimization.
For more detail on Adam, see Adam on Wikipedia.
Parameters
learning_rate (float, optional) – The learning rate for the Adam optimizer. If set, must be positive. Defaults to 0.01.
SEE ALSO
boulderopal.run_stochastic_optimization : Perform gradient-based stochastic optimization of generic real-valued functions.
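Examples
A minimal sketch of passing Adam to boulderopal.run_stochastic_optimization, here minimizing a simple quadratic cost. The graph-construction calls (graph.optimization_variable, graph.sum), the keyword arguments to run_stochastic_optimization, and the result-access pattern are assumptions based on the Boulder Opal graph API and may differ across versions.

```python
import boulderopal as bo

# Build a graph whose cost is the quadratic (x - 2)^2, minimized at x = 2.
# (Node names and parameters here are assumptions; check your version's docs.)
graph = bo.Graph()
x = graph.optimization_variable(
    count=1, lower_bound=-5.0, upper_bound=5.0, name="x"
)
cost = graph.sum((x - 2.0) ** 2, name="cost")

# Run the stochastic optimization with a custom Adam learning rate.
result = bo.run_stochastic_optimization(
    graph=graph,
    cost_node_name="cost",
    output_node_names=["x"],
    optimizer=bo.stochastic.Adam(learning_rate=0.05),
    iteration_count=200,
)

# The optimized value of x should approach 2.
print(result["output"]["x"]["value"])
```

A larger learning rate speeds up early progress but can overshoot near the optimum; the 0.01 default is a conservative starting point that you can tune per problem.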