OptimConfig#
- class stable_ssl.config.OptimConfig(optimizer: dict, scheduler: dict, epochs: int = 1000, max_steps: int = -1, accumulation_steps: int = 1, grad_max_norm: float | None = None)[source]#
Bases: object
Configuration for the optimization parameters.
- Parameters:
optimizer (dict) – Configuration for the optimizer.
scheduler (dict) – Configuration for the learning rate scheduler.
epochs (int, optional) – Number of epochs to train the model. Default is 1000.
max_steps (int, optional) – Maximum number of steps to train the model. Default is -1. If negative, the model trains on the full dataset. If it is between 0 and 1, it represents the fraction of the dataset to train on.
accumulation_steps (int, optional) – Number of steps to accumulate gradients before updating the model. Default is 1.
grad_max_norm (float, optional) – Maximum norm of the gradients. If None, no clipping is applied. Default is None.
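The parameters above can be sketched as a plain dataclass. This is an illustrative stand-in mirroring the documented signature, not the library's own implementation, and the `optimizer`/`scheduler` dict keys (`"name"`, `"lr"`) are assumptions for the example:

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class OptimConfigSketch:
    """Hypothetical stand-in mirroring stable_ssl.config.OptimConfig's signature."""

    optimizer: dict
    scheduler: dict
    epochs: int = 1000
    max_steps: int = -1           # negative -> train on the full dataset
    accumulation_steps: int = 1   # gradient-accumulation steps per update
    grad_max_norm: float | None = None  # None -> no gradient clipping


# Example: AdamW with a cosine schedule, clipping gradients at norm 1.0.
cfg = OptimConfigSketch(
    optimizer={"name": "AdamW", "lr": 1e-3},
    scheduler={"name": "CosineAnnealingLR"},
    epochs=100,
    grad_max_norm=1.0,
)
print(cfg.epochs, cfg.max_steps, cfg.grad_max_norm)
```

Fields left unset fall back to the documented defaults, e.g. `max_steps=-1` and `accumulation_steps=1` in the instance above.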