Papers
arxiv:2512.08217

Correction of Decoupled Weight Decay

Published on Apr 13
Authors:

Abstract

Research challenges conventional weight decay assumptions in optimizers, proposing that decoupled weight decay should scale with the square of the learning rate for improved training stability and performance.

AI-generated summary

Decoupled weight decay, solely responsible for the performance advantage of AdamW over Adam, has long been set proportional to the learning rate γ without question. Some researchers have recently challenged this assumption, arguing from orthogonality considerations at steady state that decoupled weight decay should instead be set ∝ γ². On the contrary, we find that eliminating the contribution of the perpendicular component of the update to the weight norm changes the training dynamics very little. Instead, we derive that decoupled weight decay ∝ γ² results in a stable weight norm from the simple assumption that updates become independent of the weights at steady state, regardless of the nature of the optimizer. From the same assumption, we derive and empirically verify that the Total Update Contribution (TUC) of a minibatch under the Scion optimizer is better characterized by a momentum-dependent effective learning rate whose optimal value transfers. We further show that decoupled weight decay ∝ γ² leads to stable weight and gradient norms, allows us to better control the training dynamics, and improves model performance.
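The steady-state argument can be illustrated numerically: if the per-step update is independent of the weights and the decoupled decay coefficient is c·γ², the stationary weight norm works out to roughly √(d/(2c)), independent of γ. Below is a minimal sketch of this dynamic; the coefficient `c`, dimension `d`, and step counts are illustrative choices, not values from the paper.

```python
import numpy as np

def run(gamma, c=0.1, d=1000, steps=20000, seed=0):
    """Simulate decoupled weight decay with coefficient c * gamma**2.

    Updates are drawn independently of the weights, matching the
    paper's steady-state assumption. At stationarity,
    E||w||^2 ≈ gamma^2 * d / (1 - (1 - c*gamma^2)^2) ≈ d / (2c),
    which does not depend on gamma.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(d)
    for _ in range(steps):
        u = rng.standard_normal(d)                 # update independent of w
        w = w * (1.0 - c * gamma**2) - gamma * u   # decay ∝ γ², then step
    return np.linalg.norm(w)

# Steady-state norm is ≈ sqrt(d / (2c)) for either learning rate.
n_big, n_small = run(0.1), run(0.05)
```

Running with two different learning rates gives nearly identical final weight norms, consistent with the claim that decay ∝ γ² stabilizes the weight norm across learning rates.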


