stable_ssl.optim

The optim module provides custom optimizers and learning rate schedulers for self-supervised learning.

Optimizers

LARS(params[, lr, momentum, dampening, ...])

Extends PyTorch's SGD with layer-wise adaptive rate scaling (LARS) from the paper "Large Batch Training of Convolutional Networks" (You et al., 2017).
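
A minimal usage sketch, assuming LARS is a drop-in replacement for a torch.optim optimizer and that the import path follows the module name above; the concrete lr and momentum values are illustrative, not library defaults.

    import torch
    from stable_ssl.optim import LARS

    model = torch.nn.Linear(128, 10)

    # LARS takes SGD-style arguments; lr and momentum appear in the signature
    # above, the values below are illustrative only.
    optimizer = LARS(model.parameters(), lr=0.1, momentum=0.9)

    for _ in range(10):
        loss = model(torch.randn(32, 128)).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()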

Learning Rate Schedulers

CosineDecayer(total_steps[, n_cycles, gamma])

Apply cosine decay with multiple cycles for learning rate scheduling.
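
Because CosineDecayer takes no optimizer, it is presumably a callable mapping the current step to a learning-rate multiplier. The sketch below assumes it can be passed as the lr_lambda of torch.optim.lr_scheduler.LambdaLR; that pairing is an assumption, not documented behavior.

    import torch
    from stable_ssl.optim import CosineDecayer

    optimizer = torch.optim.SGD(torch.nn.Linear(8, 2).parameters(), lr=0.1)

    # Assumed usage: the decayer produces a cosine-shaped multiplier with
    # n_cycles restarts spread over total_steps.
    decayer = CosineDecayer(total_steps=1000, n_cycles=3)
    scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=decayer)

    for _ in range(1000):
        optimizer.step()   # parameter update (gradient computation omitted)
        scheduler.step()   # advance the multi-cycle cosine decay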

LinearWarmup(optimizer, total_steps[, ...])

Create a linear warmup learning rate scheduler.
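
A sketch assuming LinearWarmup follows the standard torch scheduler protocol (one step() call per training step) and subclasses torch's LRScheduler, so get_last_lr() is available; only the optimizer and total_steps arguments shown above are passed.

    import torch
    from stable_ssl.optim import LinearWarmup

    optimizer = torch.optim.SGD(torch.nn.Linear(8, 2).parameters(), lr=1.0)

    # Ramp the learning rate linearly; warmup-specific keyword arguments are
    # left at their defaults since they are elided in the signature above.
    scheduler = LinearWarmup(optimizer, total_steps=100)

    for step in range(100):
        optimizer.step()
        scheduler.step()
        if step % 25 == 0:
            print(step, scheduler.get_last_lr())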

LinearWarmupCosineAnnealing(optimizer, ...)

Combine linear warmup with cosine annealing decay.
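
The combined scheduler is presumably constructed the same way; in the sketch below only the optimizer argument is confirmed by the signature above, and total_steps mirrors LinearWarmup as an assumption.

    import torch
    from stable_ssl.optim import LinearWarmupCosineAnnealing

    optimizer = torch.optim.AdamW(torch.nn.Linear(8, 2).parameters(), lr=1e-3)

    # Assumed behavior: warm up linearly to the base lr, then anneal it along
    # a cosine curve over the remaining steps (total_steps is an assumption).
    scheduler = LinearWarmupCosineAnnealing(optimizer, total_steps=10000)

    for _ in range(10000):
        optimizer.step()
        scheduler.step()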

LinearWarmupCyclicAnnealing(optimizer, ...)

Combine linear warmup with cyclic cosine annealing (usage sketch shared with the three-step variant below).

LinearWarmupThreeStepsAnnealing(optimizer, ...)

Combine linear warmup with a three-step learning rate annealing.
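
The cyclic and three-step variants are presumably used the same way as the other warmup schedulers, differing only in the shape of the post-warmup decay; a sketch, again treating total_steps as an assumed keyword.

    import torch
    from stable_ssl.optim import (
        LinearWarmupCyclicAnnealing,
        LinearWarmupThreeStepsAnnealing,
    )

    optimizer = torch.optim.SGD(torch.nn.Linear(8, 2).parameters(), lr=0.1)

    # Pick one of the two variants; the constructor call mirrors the other
    # warmup schedulers on this page (total_steps is an assumption).
    scheduler = LinearWarmupCyclicAnnealing(optimizer, total_steps=5000)
    # scheduler = LinearWarmupThreeStepsAnnealing(optimizer, total_steps=5000)

    for _ in range(5000):
        optimizer.step()
        scheduler.step()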