AdaBound for Keras

Keras port of AdaBound Optimizer for PyTorch, from the paper Adaptive Gradient Methods with Dynamic Bound of Learning Rate.
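
The "dynamic bound" in the title refers to clipping Adam's per-parameter step size between a lower and an upper bound that both converge to final_lr as training progresses, so the optimizer behaves like Adam early on and more like SGD late in training. The sketch below is a rough illustration of that schedule, not the code in adabound.py; the bound formulas follow the paper, with gamma controlling how fast the bounds tighten.

import numpy as np

def bounded_step(adam_step, t, final_lr=0.1, gamma=1e-3):
    # Bounds from the paper: both converge to final_lr as t grows.
    lower = final_lr * (1.0 - 1.0 / (gamma * t + 1.0))
    upper = final_lr * (1.0 + 1.0 / (gamma * t))
    # Adam's proposed step size is clipped into [lower, upper].
    return np.clip(adam_step, lower, upper)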

Usage

Add the adabound.py script to your project and import it. It can be a drop-in replacement for the Adam optimizer.

It also supports the AMSBound variant of the above, equivalent to AMSGrad from Adam.
from adabound import AdaBound

optm = AdaBound(lr=1e-03, final_lr=0.1, gamma=1e-03, weight_decay=0., amsbound=False)
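
The optimizer plugs into model.compile like any built-in Keras optimizer. The snippet below is a minimal sketch using a toy model and random data, just to show the wiring; it assumes adabound.py is on the path as described above.

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from adabound import AdaBound

# Toy model: the architecture is a placeholder, not the ResNet used in the results.
model = Sequential([Dense(32, activation='relu', input_shape=(16,)),
                    Dense(10, activation='softmax')])

model.compile(optimizer=AdaBound(lr=1e-03, final_lr=0.1, gamma=1e-03,
                                 weight_decay=0., amsbound=False),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Random data, purely to make the example runnable end to end.
x = np.random.rand(256, 16)
y = np.eye(10)[np.random.randint(0, 10, 256)]
model.fit(x, y, batch_size=32, epochs=1)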

Results

With a wide ResNet 34, horizontal-flip data augmentation, and 100 epochs of training at batch size 128, it reaches 92.16% test accuracy (called v1).

Weights are available in the Releases tab.

NOTE

  • The smaller ResNet 20 models have been removed as they did not perform as expected and depended on a flaw in the initial implementation. The ResNet 32 results show the actual performance of this optimizer.

With a small ResNet 20, width- and height-shift plus horizontal-flip data augmentation, and 100 epochs of training at batch size 1024, it reaches 89.5% (called v1).

On a small ResNet 20 with only width- and height-shift augmentations, trained for 100 epochs at batch size 1024, the model gets close to 86% on the test set (called v3 in the plots below).

(Plots in the repository show train set accuracy, train set loss, test set accuracy, and test set loss for the runs above.)
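
For reference, the augmentations described above map directly onto Keras' ImageDataGenerator. The sketch below shows one plausible configuration; the shift fractions are illustrative assumptions, not values taken from this repository.

from keras.preprocessing.image import ImageDataGenerator

# Width/height shifts plus horizontal flips, as in the runs described above.
# The 0.1 shift fractions are guesses for illustration only.
datagen = ImageDataGenerator(width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

# datagen.flow(x_train, y_train, batch_size=1024) then feeds augmented
# batches into model.fit_generator (Keras 2.x).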

Requirements

  • Keras 2.2.4+ and TensorFlow 1.12+ (only the TensorFlow backend is supported for now).
  • NumPy
