tf.contrib.opt

 


 

Modules

 

None

 

Classes

 

AddSignOptimizer:

Optimizer that implements the AddSign update.
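
A minimal single-variable sketch, assuming the TF 1.x API; learning_rate, alpha and beta are the AddSign hyperparameters from the Neural Optimizer Search paper (Bello et al., 2017), and the values shown are illustrative:

    import tensorflow as tf

    # Toy quadratic objective.
    x = tf.Variable([3.0, -2.0])
    loss = tf.reduce_sum(tf.square(x))

    # AddSign scales each gradient by (alpha + sign(g) * sign(m)),
    # where m is the gradient's moving average with decay beta.
    opt = tf.contrib.opt.AddSignOptimizer(learning_rate=0.1, alpha=1.0, beta=0.9)
    train_op = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_op)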

DropStaleGradientOptimizer:

Wrapper optimizer that checks for and drops stale gradients.
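
A hedged sketch of typical use, assuming a `staleness` constructor argument (the maximum tolerated step lag) and use with a global step, as in between-graph replicated training; run single-process, no gradient is ever stale:

    import tensorflow as tf

    x = tf.Variable(1.0)
    loss = tf.square(x)
    global_step = tf.train.get_or_create_global_step()

    # Gradients computed more than `staleness` steps before the current
    # global step are dropped instead of being applied.
    base_opt = tf.train.GradientDescentOptimizer(0.01)
    opt = tf.contrib.opt.DropStaleGradientOptimizer(base_opt, staleness=10)
    train_op = opt.minimize(loss, global_step=global_step)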

ElasticAverageCustomGetter:

Custom getter that sets up the local (worker) and global variable copies required by ElasticAverageOptimizer; see the sketch after the ElasticAverageOptimizer entry.

ElasticAverageOptimizer:

Wrapper optimizer that implements the Elastic Average SGD algorithm.
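
A heavily hedged distributed sketch covering both the custom getter and the optimizer. The cluster spec and device strings are placeholders for a real between-graph replicated setup, and the keyword names (num_worker, ea_custom_getter) and the make_session_run_hook call are assumptions based on the TF 1.x contrib API:

    import tensorflow as tf

    # Placeholder cluster configuration (normally built from TF_CONFIG/flags).
    cluster = tf.train.ClusterSpec(
        {'ps': ['ps0:2222'], 'worker': ['worker0:2222']})
    task_index = 0
    is_chief = (task_index == 0)
    worker_device = '/job:worker/task:%d' % task_index

    # The custom getter keeps trainable variables local to the worker and
    # creates matching global (center) copies on the parameter servers.
    ea_getter = tf.contrib.opt.ElasticAverageCustomGetter(
        worker_device=worker_device)
    with tf.device(tf.train.replica_device_setter(
            worker_device=worker_device, cluster=cluster)), \
         tf.variable_scope('', custom_getter=ea_getter):
        w = tf.get_variable('w', shape=[10, 1])
        loss = tf.reduce_sum(tf.square(w))

    opt = tf.contrib.opt.ElasticAverageOptimizer(
        tf.train.GradientDescentOptimizer(0.01),
        num_worker=1,
        ea_custom_getter=ea_getter)
    train_op = opt.minimize(
        loss, global_step=tf.train.get_or_create_global_step())
    # Assumed hook that keeps local and center variables in sync.
    hooks = [opt.make_session_run_hook(is_chief, task_index)]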

ExternalOptimizerInterface:

Base class for interfaces with external optimization algorithms.

LazyAdamOptimizer:

Variant of the Adam optimizer that handles sparse updates more efficiently.
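
A minimal sketch, assuming LazyAdamOptimizer is a drop-in replacement for tf.train.AdamOptimizer (same constructor arguments). The gain shows up with sparse gradients such as embedding lookups, where only the rows used in the batch have their Adam slot variables updated:

    import tensorflow as tf

    embeddings = tf.Variable(tf.random_normal([1000, 64]))
    ids = tf.constant([3, 17, 42])
    # The gradient w.r.t. `embeddings` is an IndexedSlices touching 3 rows.
    loss = tf.reduce_sum(tf.square(tf.nn.embedding_lookup(embeddings, ids)))

    opt = tf.contrib.opt.LazyAdamOptimizer(learning_rate=0.001)
    train_op = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train_op)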

ModelAverageCustomGetter:

Custom getter that sets up the local (worker) and global variable copies required by ModelAverageOptimizer; see the sketch after the ModelAverageOptimizer entry.

ModelAverageOptimizer:

Wrapper optimizer that implements the Model Average algorithm.
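
A heavily hedged sketch mirroring the Elastic Average setup above (a real deployment would also wrap the scope in tf.train.replica_device_setter); the keyword names (num_worker, is_chief, ma_custom_getter, interval_steps) are assumptions based on the TF 1.x contrib API:

    import tensorflow as tf

    # Placeholder device string, as in the Elastic Average sketch.
    worker_device = '/job:worker/task:0'
    ma_getter = tf.contrib.opt.ModelAverageCustomGetter(
        worker_device=worker_device)
    with tf.variable_scope('', custom_getter=ma_getter):
        w = tf.get_variable('w', shape=[10, 1])
        loss = tf.reduce_sum(tf.square(w))

    # Local copies train independently; every `interval_steps` steps the
    # global copies are replaced by the average over workers.
    opt = tf.contrib.opt.ModelAverageOptimizer(
        tf.train.GradientDescentOptimizer(0.01),
        num_worker=1,
        is_chief=True,
        ma_custom_getter=ma_getter,
        interval_steps=100)
    train_op = opt.minimize(
        loss, global_step=tf.train.get_or_create_global_step())
    hooks = [opt.make_session_run_hook()]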

MovingAverageOptimizer:

Optimizer that computes a moving average of the variables.
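
A minimal sketch; swapping_saver() is the documented companion method, producing a Saver that writes the averaged values in place of the raw variables so that restored checkpoints evaluate with the moving averages:

    import tensorflow as tf

    x = tf.Variable(5.0)
    loss = tf.square(x)

    opt = tf.contrib.opt.MovingAverageOptimizer(
        tf.train.GradientDescentOptimizer(0.1), average_decay=0.9999)
    train_op = opt.minimize(loss)
    saver = opt.swapping_saver()

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_op)
        saver.save(sess, '/tmp/model.ckpt')  # checkpoint path is illustrative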

MultitaskOptimizerWrapper:

Optimizer wrapper making all-zero gradients harmless.
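
The intended pairing with clip_gradients_by_global_norm (listed under Functions below), adapted from the class docstring:

    import tensorflow as tf

    x = tf.Variable([1.0, 2.0])
    loss = tf.reduce_sum(tf.square(x))

    opt = tf.train.MomentumOptimizer(learning_rate=0.1, momentum=0.9)
    opt = tf.contrib.opt.MultitaskOptimizerWrapper(opt)

    # All-zero gradients (e.g. from a task absent in the batch) are masked so
    # they neither pollute momentum slots nor count toward the global norm.
    gradvars = opt.compute_gradients(loss)
    gradvars_clipped, _ = tf.contrib.opt.clip_gradients_by_global_norm(
        gradvars, 15.0)
    train_op = opt.apply_gradients(gradvars_clipped)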

NadamOptimizer:

Optimizer that implements the Nadam algorithm.
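
Nadam is Adam with Nesterov momentum; a minimal sketch assuming the same constructor arguments as tf.train.AdamOptimizer:

    import tensorflow as tf

    x = tf.Variable([3.0, -2.0])
    loss = tf.reduce_sum(tf.square(x))

    opt = tf.contrib.opt.NadamOptimizer(
        learning_rate=0.001, beta1=0.9, beta2=0.999, epsilon=1e-8)
    train_op = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        sess.run(train_op)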

PowerSignOptimizer:

Optimizer that implements the PowerSign update.
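
A minimal sketch mirroring the AddSign example; PowerSign's multiplicative `base` (default e) replaces AddSign's additive `alpha`, and the keyword names are assumptions from the TF 1.x contrib signature:

    import math
    import tensorflow as tf

    x = tf.Variable([3.0, -2.0])
    loss = tf.reduce_sum(tf.square(x))

    # PowerSign scales each gradient by base ** (sign(g) * sign(m)),
    # where m is the gradient's moving average with decay beta.
    opt = tf.contrib.opt.PowerSignOptimizer(
        learning_rate=0.1, base=math.e, beta=0.9)
    train_op = opt.minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(100):
            sess.run(train_op)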

ScipyOptimizerInterface:

Wrapper allowing scipy.optimize.minimize to operate within a tf.Session.
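
The canonical example from the class docstring, minimizing a quadratic with L-BFGS-B; extra keyword arguments such as method and options are forwarded to scipy.optimize.minimize. This class is also the main concrete implementation of ExternalOptimizerInterface above:

    import tensorflow as tf

    vector = tf.Variable([7.0, 7.0], name='vector')
    loss = tf.reduce_sum(tf.square(vector))

    optimizer = tf.contrib.opt.ScipyOptimizerInterface(
        loss, method='L-BFGS-B', options={'maxiter': 100})

    with tf.Session() as session:
        session.run(tf.global_variables_initializer())
        # Runs scipy.optimize.minimize, feeding values and gradients
        # from the graph and writing the result back into `vector`.
        optimizer.minimize(session)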

VariableClippingOptimizer:

Wrapper optimizer that clips the norm of specified variables after update.
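
A hedged sketch: per the TF 1.x contrib signature, the constructor takes the wrapped optimizer, a dict mapping each variable to the dimensions over which its norm is computed, and the maximum norm; the positional argument order here is an assumption:

    import tensorflow as tf

    embeddings = tf.Variable(tf.random_normal([100, 16]))
    ids = tf.constant([1, 5, 9])
    loss = tf.reduce_sum(tf.square(tf.nn.embedding_lookup(embeddings, ids)))

    # After each update, clip each embedding row (norm over dimension 1)
    # back to the max norm of 1.0.
    opt = tf.contrib.opt.VariableClippingOptimizer(
        tf.train.GradientDescentOptimizer(1.0), {embeddings: [1]}, 1.0)
    train_op = opt.minimize(loss)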

 

Functions

 

clip_gradients_by_global_norm(…):

Clips gradients of a multitask loss by their global norm.
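
Standalone usage; unlike tf.clip_by_global_norm, it takes (gradient, variable) pairs directly and returns the clipped pairs along with the pre-clip global norm, passing None gradients (such as those masked by MultitaskOptimizerWrapper) through untouched:

    import tensorflow as tf

    x = tf.Variable([3.0, 4.0])
    loss = tf.reduce_sum(tf.square(x))
    opt = tf.train.GradientDescentOptimizer(0.1)

    gradvars = opt.compute_gradients(loss)
    clipped_gradvars, global_norm = tf.contrib.opt.clip_gradients_by_global_norm(
        gradvars, clip_norm=1.0)
    train_op = opt.apply_gradients(clipped_gradvars)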