LinearWarmup
- stable_ssl.optim.LinearWarmup(optimizer, total_steps, start_factor=0.01, peak_step=0.1)
Create a linear warmup learning rate scheduler.
The scheduler linearly increases the learning rate from start_factor times the base learning rate to the full learning rate over the warmup steps.
- Parameters:
  - optimizer (torch.optim.Optimizer) – Wrapped optimizer whose learning rate is warmed up.
  - total_steps (int) – Total number of training steps.
  - start_factor (float) – Factor multiplying the base learning rate at the first step. Default: 0.01.
  - peak_step (float) – Point in the schedule at which the learning rate reaches its full value. Default: 0.1.
- Returns:
Linear warmup scheduler.
- Return type:
torch.optim.lr_scheduler.LinearLR
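Because the return type is torch.optim.lr_scheduler.LinearLR, comparable behavior can be sketched directly with PyTorch. The helper below is hypothetical (not part of stable_ssl), and its interpretation of a fractional peak_step as a fraction of total_steps is an assumption:
>>> import torch
>>> def linear_warmup_sketch(optimizer, total_steps, start_factor=0.01, peak_step=0.1):
...     # Assumption: a fractional peak_step denotes a fraction of total_steps.
...     warmup_iters = int(peak_step * total_steps) if peak_step < 1 else int(peak_step)
...     return torch.optim.lr_scheduler.LinearLR(
...         optimizer, start_factor=start_factor, total_iters=warmup_iters
...     )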
Example
>>> import torch
>>> model = torch.nn.Linear(10, 2)
>>> optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
>>> scheduler = LinearWarmup(optimizer, total_steps=1000, start_factor=0.01)
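Once created, the scheduler is advanced once per optimization step. A minimal end-to-end sketch continuing the example above; the loss, inputs, and targets are illustrative only:
>>> loss_fn = torch.nn.MSELoss()
>>> data, target = torch.randn(4, 10), torch.randn(4, 2)
>>> for step in range(1000):
...     optimizer.zero_grad()
...     loss = loss_fn(model(data), target)
...     loss.backward()
...     optimizer.step()
...     scheduler.step()  # advance the warmup by one step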