LinearWarmup

class stable_ssl.optim.LinearWarmup(optimizer, total_steps, start_factor=0.01, peak_step=0.1)

Create a linear warmup learning rate scheduler.

This function creates a linear warmup scheduler that gradually increases the learning rate from a small initial value (the base learning rate scaled by start_factor) to the full learning rate over the first peak_step fraction of total_steps.

Parameters:
  • optimizer (torch.optim.Optimizer) – The optimizer to schedule.

  • total_steps (int) – Total number of training steps.

  • start_factor (float, optional) – Initial learning rate factor. Defaults to 0.01.

  • peak_step (float, optional) – Step at which warmup peaks (as fraction of total_steps). Defaults to 0.1.

Returns:

Linear warmup scheduler.

Return type:

torch.optim.lr_scheduler.LinearLR
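
Because the returned object is a plain torch.optim.lr_scheduler.LinearLR, the default schedule can be pictured as a ramp from 0.01 * lr to lr over the first peak_step * total_steps optimization steps. Below is a minimal sketch of that equivalent construction, assuming the warmup length is peak_step * total_steps; this is an illustration of the resulting schedule, not the library's exact implementation:

>>> import torch
>>> layer = torch.nn.Linear(4, 4)
>>> optimizer = torch.optim.Adam(layer.parameters(), lr=0.001)
>>> warmup_iters = int(0.1 * 1000)  # peak_step * total_steps = 100 warmup steps
>>> equivalent = torch.optim.lr_scheduler.LinearLR(
...     optimizer, start_factor=0.01, end_factor=1.0, total_iters=warmup_iters
... )

After warmup_iters steps the scaling factor stays at 1.0, so the learning rate remains at its base value for the rest of training.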

Example

>>> import torch
>>> from stable_ssl.optim import LinearWarmup
>>> optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
>>> scheduler = LinearWarmup(optimizer, total_steps=1000, start_factor=0.01)
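
As with any PyTorch learning rate scheduler, scheduler.step() is called once per optimization step. A minimal sketch of the per-step usage, where compute_loss() is a hypothetical placeholder for the actual forward pass and loss computation:

>>> for step in range(1000):  # total_steps
...     optimizer.zero_grad()
...     loss = compute_loss()  # placeholder for the real forward pass / loss
...     loss.backward()
...     optimizer.step()
...     scheduler.step()  # advances the warmup by one step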