stable_ssl.optim#
The optim module provides custom optimizers and learning rate schedulers for self-supervised learning.
Optimizers#
- LARS: extends PyTorch's SGD with layer-wise adaptive rate scaling (LARS), as introduced in "Large Batch Training of Convolutional Networks" (You et al., 2017). A minimal sketch of the LARS update follows below.
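The LARS rule itself is simple to illustrate. The snippet below is a minimal, standalone sketch of the layer-wise trust ratio applied on top of a plain SGD step; it does not use stable_ssl's actual class or constructor arguments, and the hyperparameter names and defaults (eta, weight_decay) are assumptions taken from the paper, not from this library.

```python
import torch

def lars_trust_ratio(param: torch.Tensor, grad: torch.Tensor,
                     weight_decay: float = 1e-4, eta: float = 1e-3) -> float:
    """Layer-wise trust ratio from the LARS paper (You et al., 2017).

    The local learning rate is scaled by eta * ||w|| / (||g|| + wd * ||w||),
    so layers whose gradients are small relative to their weights still make progress.
    """
    w_norm = param.norm()
    g_norm = grad.norm()
    if w_norm == 0 or g_norm == 0:
        return 1.0  # fall back to a plain SGD step for zero tensors
    return (eta * w_norm / (g_norm + weight_decay * w_norm)).item()

# Manual SGD step with LARS scaling over a toy model (illustrative only).
model = torch.nn.Linear(128, 10)
base_lr, weight_decay = 0.1, 1e-4
loss = model(torch.randn(32, 128)).pow(2).mean()
loss.backward()
with torch.no_grad():
    for p in model.parameters():
        if p.grad is None:
            continue
        trust = lars_trust_ratio(p, p.grad, weight_decay=weight_decay)
        # weight decay is added to the gradient, as in the LARS paper
        p.add_(p.grad + weight_decay * p, alpha=-base_lr * trust)
```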
Learning Rate Schedulers#
The following schedulers are provided (sketches of the warmup-plus-cosine combination and of multi-cycle cosine decay follow this list):

- Cosine decay with multiple cycles.
- Linear warmup.
- Linear warmup combined with cosine annealing decay.
- Linear warmup combined with cyclic cosine annealing.
- Linear warmup combined with a three-step learning rate annealing.
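As an illustration of the warmup-plus-annealing pattern these schedulers implement, the sketch below builds a linear-warmup-then-cosine schedule with PyTorch's `LambdaLR`. It is a minimal stand-in, not stable_ssl's implementation; the parameter names (`peak_step`, `total_steps`) and the use of a multiplicative `LambdaLR` factor are assumptions.

```python
import math
import torch
from torch.optim.lr_scheduler import LambdaLR

def warmup_cosine_lambda(peak_step: int, total_steps: int):
    """Return a LambdaLR multiplier: linear ramp to 1.0, then cosine decay to 0."""
    def fn(step: int) -> float:
        if step < peak_step:
            return step / max(1, peak_step)                 # linear warmup
        progress = (step - peak_step) / max(1, total_steps - peak_step)
        return 0.5 * (1.0 + math.cos(math.pi * progress))   # cosine annealing
    return fn

model = torch.nn.Linear(8, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = LambdaLR(optimizer, lr_lambda=warmup_cosine_lambda(peak_step=10, total_steps=100))

for step in range(100):
    optimizer.step()   # (forward/backward pass omitted in this sketch)
    scheduler.step()   # sets lr = base_lr * multiplier(step)
```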
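The multi-cycle cosine decay listed above can be sketched as a plain multiplier function in the same way; the cycle count and the per-cycle damping factor below are illustrative assumptions, not the module's defaults. The function can be wrapped in `LambdaLR` exactly like the previous sketch.

```python
import math

def multi_cycle_cosine(step: int, total_steps: int, n_cycles: int = 3,
                       damping: float = 0.75) -> float:
    """Cosine decay repeated over `n_cycles` cycles, each damped by `damping`.

    Returns a learning-rate multiplier in (0, 1]; multiply it by the base lr.
    """
    cycle_len = total_steps / n_cycles
    cycle = int(step // cycle_len)
    pos = (step % cycle_len) / cycle_len                    # position inside the current cycle
    return (damping ** cycle) * 0.5 * (1.0 + math.cos(math.pi * pos))
```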