pytorch_tao.plugins.Scheduler
- class pytorch_tao.plugins.Scheduler(torch_scheduler: _LRScheduler)
Simple adapter that runs a built-in PyTorch learning-rate scheduler as a trainer plugin.
- Parameters
torch_scheduler – a built-in PyTorch learning-rate scheduler (any _LRScheduler).
Hooks

    Hook Point            Logic
    ITERATION_COMPLETED   call scheduler.step and track the learning rate
    import pytorch_tao as tao
    from pytorch_tao.plugins import Scheduler
    from torch.optim.lr_scheduler import StepLR

    model = ...
    optimizer = ...

    trainer = tao.Trainer()
    trainer.use(Scheduler(StepLR(optimizer, step_size=30)))
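The hook itself is a thin wrapper around the wrapped scheduler. A minimal sketch of the equivalent ITERATION_COMPLETED logic, assuming an ignite-style engine; the helper name and the way the learning rate is stored are illustrative assumptions, not the library's actual implementation:

    from torch.optim.lr_scheduler import _LRScheduler

    def make_iteration_hook(scheduler: _LRScheduler):
        # Roughly what the plugin does on ITERATION_COMPLETED:
        # advance the schedule, then read back the learning rate so it
        # can be tracked.
        def _hook(engine):
            scheduler.step()
            last_lr = scheduler.get_last_lr()  # one value per param group
            # Assumption: where the tracked value goes is library-specific;
            # here it is simply stashed on the engine state.
            engine.state.lr = last_lr[0]
        return _hook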
- __init__(torch_scheduler: _LRScheduler)
Methods

    __init__(torch_scheduler)
    after_use()
    attach(engine)
    set_engine(engine)
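attach(engine) and set_engine(engine) are normally called for you by trainer.use. A hedged sketch of wiring the plugin to an engine by hand, assuming the engine is a pytorch-ignite Engine; that assumption, the placeholder training step, and the model/optimizer choices are all illustrative:

    import torch
    from ignite.engine import Engine
    from pytorch_tao.plugins import Scheduler
    from torch.optim.lr_scheduler import StepLR

    model = torch.nn.Linear(4, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    def train_step(engine, batch):
        ...  # placeholder for one training iteration

    engine = Engine(train_step)
    plugin = Scheduler(StepLR(optimizer, step_size=30))
    plugin.attach(engine)  # registers the plugin's hooks on this engine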