Learning Rate Decay Calculator
Calculate the learning rate at each epoch or step under different decay strategies, helping you tune training schedules for deep learning models.

What this calculator is doing
This tool computes learning rates across training steps or epochs using a specified decay strategy. Learning rate
schedules are critical for training deep neural networks effectively. Supported decay types:
Exponential Decay:
\[
\text{LR}(t) = \text{LR}_0 \cdot \gamma^t
\]
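As a minimal sketch, the exponential schedule can be evaluated directly; the function name and argument names below are illustrative, not part of the calculator:

```python
def exponential_decay(lr0, gamma, t):
    """Exponential decay: LR(t) = lr0 * gamma**t."""
    return lr0 * gamma ** t

# Example: lr0 = 0.1, gamma = 0.9
# t = 0 -> 0.1, t = 1 -> 0.09, t = 2 -> 0.081
```

With `gamma` just below 1, the rate shrinks by a constant factor every step, so the decay is fastest early in training.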
Step Decay:
\[
\text{LR}(t) = \text{LR}_0 \cdot \gamma^{\left\lfloor \frac{t}{\text{decay step}} \right\rfloor}
\]
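A sketch of step decay, where the floor in the exponent maps to integer division (names are illustrative):

```python
def step_decay(lr0, gamma, t, decay_step):
    """Step decay: the rate drops by a factor gamma every decay_step steps."""
    return lr0 * gamma ** (t // decay_step)

# Example: lr0 = 0.1, gamma = 0.5, decay_step = 10
# t in [0, 9] -> 0.1, t in [10, 19] -> 0.05, t in [20, 29] -> 0.025
```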
Polynomial Decay:
\[
\text{LR}(t) = (\text{LR}_0 - \text{LR}_{\text{min}}) \cdot \left(1 - \frac{t}{T}\right)^{\text{power}} + \text{LR}_{\text{min}}
\]
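A sketch of polynomial decay over a horizon of `total_steps` (the T in the formula); names are illustrative:

```python
def polynomial_decay(lr0, lr_min, t, total_steps, power=1.0):
    """Polynomial decay from lr0 at t=0 down to lr_min at t=T."""
    frac = 1.0 - t / total_steps
    return (lr0 - lr_min) * frac ** power + lr_min
```

A `power` of 1 reduces to linear decay; larger powers front-load the decay so the rate drops quickly at first and flattens out near `lr_min`.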
Cosine Decay:
\[
\text{LR}(t) = \text{LR}_{\text{min}} + \frac{1}{2}(\text{LR}_0 - \text{LR}_{\text{min}})\left(1 + \cos\left(\pi \cdot
\frac{t}{T}\right)\right)
\]
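A sketch of the cosine schedule (names are illustrative):

```python
import math

def cosine_decay(lr0, lr_min, t, total_steps):
    """Cosine decay: smooth anneal from lr0 at t=0 to lr_min at t=T."""
    return lr_min + 0.5 * (lr0 - lr_min) * (1 + math.cos(math.pi * t / total_steps))
```

At t = 0 the cosine term is 1, giving `lr0`; at t = T it is -1, giving `lr_min`; in between the rate falls along a smooth half-cosine curve.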
Linear Decay:
\[
\text{LR}(t) = \text{LR}_0 - \left(\frac{\text{LR}_0 - \text{LR}_{\text{min}}}{T}\right) \cdot t
\]
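A sketch of linear decay (names are illustrative):

```python
def linear_decay(lr0, lr_min, t, total_steps):
    """Linear decay from lr0 at t=0 to lr_min at t=T."""
    return lr0 - (lr0 - lr_min) / total_steps * t

# Example: lr0 = 0.1, lr_min = 0.0, total_steps = 100
# t = 0 -> 0.1, t = 50 -> 0.05, t = 100 -> 0.0
```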
Warmup (optional): Increases the LR linearly from 0 (or a defined warmup LR) to LR₀ over a specified number of warmup steps, before the decay schedule is applied.
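Warmup can be layered on top of any of the schedules above. The combinator below is a sketch under the assumption that the decay function's clock starts after warmup ends; all names are illustrative:

```python
def lr_with_warmup(t, warmup_steps, lr0, decay_fn, warmup_lr=0.0):
    """Linear warmup from warmup_lr to lr0, then hand off to decay_fn.

    decay_fn receives the number of steps elapsed since warmup finished.
    """
    if t < warmup_steps:
        return warmup_lr + (lr0 - warmup_lr) * t / warmup_steps
    return decay_fn(t - warmup_steps)

# Example: 10 warmup steps into a constant 0.1 schedule
# t = 5 -> 0.05 (halfway up the ramp), t >= 10 -> 0.1
```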
This tool gives visibility into how your learning rate will evolve across training and helps you avoid instability caused by a poorly tuned decay schedule.
Disclaimer: These calculators are provided for informational purposes only. Always verify your designs against relevant engineering standards and consult a qualified professional. We do not take responsibility for any errors or damages resulting from the use of these calculations.