tensorflow-ops-0.1.0.0: Friendly layer around TensorFlow bindings.

Safe Haskell: None
Language: Haskell2010

TensorFlow.Minimize

Documentation

type Minimizer a = forall m. MonadBuild m => [Variable a] -> [Tensor Value a] -> m ControlNode

Functions that minimize a loss w.r.t. a set of Variables.

A Minimizer generally performs only one step of an iterative algorithm.

Minimizers are defined as a function of the gradients (rather than of the loss) so that users can apply transformations to the gradients, such as scaling or clipping, before they are applied.
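Because a Minimizer receives the gradients directly, a caller can transform them before handing them on to an existing algorithm. A minimal sketch (the name scaledDescent and the scale factor are illustrative, and it assumes mul, scalar, expr, render, and build are in scope via the qualified imports shown):

```haskell
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Ops as TF
import TensorFlow.Minimize (Minimizer, gradientDescent)

-- A Minimizer that multiplies every gradient by a constant factor and
-- then delegates to plain gradient descent.
scaledDescent :: Float     -- Learning rate.
              -> Float     -- Gradient scale factor.
              -> Minimizer Float
scaledDescent rate factor params grads = do
    -- Scale each gradient, then render the results back to value tensors.
    scaled <- TF.build $
        mapM (TF.render . (`TF.mul` TF.scalar factor) . TF.expr) grads
    gradientDescent rate params scaled
```

The same shape works for gradient clipping or per-parameter learning rates: transform the [Tensor Value a] list, then delegate to any existing Minimizer.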

minimizeWith
  :: (MonadBuild m, GradientCompatible a)
  => Minimizer a
  -> Tensor v a      -- Loss.
  -> [Variable a]    -- Parameters of the loss function.
  -> m ControlNode

Convenience wrapper around gradients and a Minimizer.
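In outline, the wrapper composes TensorFlow.Gradient.gradients with the supplied Minimizer. A sketch of that shape (myMinimizeWith is a hypothetical name for illustration, not the library's actual definition):

```haskell
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Gradient as TF
import qualified TensorFlow.Variable as TF
import TensorFlow.Minimize (Minimizer)

-- Compute the gradients of the loss w.r.t. each parameter's value,
-- then hand parameters and gradients to the Minimizer, which emits
-- the update ops as a ControlNode.
myMinimizeWith :: Minimizer Float
               -> TF.Tensor TF.Build Float   -- Loss.
               -> [TF.Variable Float]        -- Parameters.
               -> TF.Build TF.ControlNode
myMinimizeWith minimizer loss params =
    TF.gradients loss (map TF.readValue params) >>= minimizer params
```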

gradientDescent
  :: GradientCompatible a
  => a               -- Learning rate.
  -> Minimizer a

Perform one step of the gradient descent algorithm.
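Putting the pieces together, this end-to-end sketch (adapted from the linear-regression example in the tensorflow-haskell repository; the name fit and the hyperparameters are illustrative) fits y = w*x + b by running a thousand gradientDescent steps built with minimizeWith:

```haskell
import Control.Monad (replicateM_)
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Ops as TF hiding (initializedVariable)
import qualified TensorFlow.Variable as TF
import TensorFlow.Minimize (gradientDescent, minimizeWith)

fit :: [Float] -> [Float] -> IO (Float, Float)
fit xData yData = TF.runSession $ do
    -- Constants for the observed x and y values.
    let x = TF.vector xData
        y = TF.vector yData
    -- Scalar variables for slope and intercept.
    w <- TF.initializedVariable 0
    b <- TF.initializedVariable 0
    -- Squared-error loss of the linear model.
    let yHat = (x `TF.mul` TF.readValue w) `TF.add` TF.readValue b
        loss = TF.square (yHat `TF.sub` y)
    -- One gradient-descent step per run of trainStep.
    trainStep <- minimizeWith (gradientDescent 0.001) loss [w, b]
    replicateM_ 1000 (TF.run_ trainStep)
    -- Fetch the learned parameters.
    (TF.Scalar w', TF.Scalar b') <- TF.run (TF.readValue w, TF.readValue b)
    return (w', b')
```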

data AdamConfig

Instances

Default AdamConfig

  Methods

    def :: AdamConfig

adam :: Minimizer Float

Perform one step of the Adam algorithm.

See https://arxiv.org/abs/1412.6980.

NOTE: Currently requires all Variables to have an initializedValue.
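Switching algorithms only means passing a different Minimizer to minimizeWith; with adam, the parameters must have been created with initializedVariable so that the note above is satisfied. A hypothetical wiring sketch (adamStep is an illustrative name):

```haskell
import qualified TensorFlow.Core as TF
import qualified TensorFlow.Variable as TF
import TensorFlow.Minimize (adam, minimizeWith)

-- Build one Adam training step for the given loss and parameters, and
-- run it once inside an active session. Repeat TF.run_ to keep training.
adamStep :: TF.Tensor TF.Build Float -> [TF.Variable Float] -> TF.Session ()
adamStep loss params = do
    step <- minimizeWith adam loss params
    TF.run_ step
```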