Using Learning Rate Schedules for Deep Learning Models in Python with Keras

Last Updated on August 27, 2020

Training a neural network or large deep learning model is a difficult optimization task.

The classical algorithm used to train neural networks is stochastic gradient descent. It is well established that, on some problems, you can achieve better performance and faster training by using a learning rate that changes during training.
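For example, the legacy Keras `SGD` optimizer exposes a `decay` argument that implements a time-based schedule, shrinking the learning rate on each update as `lr = initial_lr / (1 + decay * iteration)`. A minimal pure-Python sketch of that update rule (the parameter values are illustrative):

```python
def time_based_lr(initial_lr, decay, iteration):
    """Time-based decay, as applied by the legacy Keras SGD optimizer:
    lr = initial_lr / (1 + decay * iteration)."""
    return initial_lr / (1.0 + decay * iteration)

# With initial_lr=0.1 and decay=0.01, the rate shrinks gradually:
print(time_based_lr(0.1, 0.01, 0))    # 0.1 at the first update
print(time_based_lr(0.1, 0.01, 100))  # halved to 0.05 after 100 updates
```

In Keras this behavior is enabled simply by passing a nonzero `decay` when constructing the optimizer, e.g. `SGD(lr=0.1, decay=0.01)`.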

In this post, you will discover how to use different learning rate schedules for your neural network models in Python with the Keras deep learning library.

After reading this post, you will know:

  • How to configure and evaluate a time-based learning rate schedule.
  • How to configure and evaluate a drop-based learning rate schedule.
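The second schedule in the list above, drop-based (step) decay, multiplies the learning rate by a fixed factor every few epochs. In Keras it would typically be wired in through the `LearningRateScheduler` callback; a sketch of the schedule function itself, with illustrative hyperparameter values:

```python
import math

def step_decay(epoch, initial_lr=0.1, drop=0.5, epochs_drop=10):
    """Drop-based schedule: halve the learning rate every `epochs_drop`
    epochs. The hyperparameter values here are illustrative defaults."""
    return initial_lr * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

# In Keras this function would be attached to training via the
# keras.callbacks.LearningRateScheduler callback, e.g.:
#   model.fit(X, y, callbacks=[LearningRateScheduler(step_decay)])
print(step_decay(0))   # 0.1 for the first block of epochs
print(step_decay(9))   # 0.05 after the first drop
print(step_decay(19))  # 0.025 after the second drop
```

The `floor` term keeps the rate constant within each block of `epochs_drop` epochs, producing the characteristic staircase curve.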

Kick-start your project with my new book Deep Learning With Python, including step-by-step tutorials and the Python source code files for all examples.

Let’s get started.

  • Update Mar/2017: Updated for Keras 2.0.2, TensorFlow 1.0.1 and Theano 0.9.0.
  • Update Sep/2019: Updated for Keras 2.2.5 API.