Running on T4 🔥
MARS results look great! Just started a training run with cmars, will report back.
Woah, looks like a good boost across most results. Been using torch.optim.adamw for months. Will try out a training run today with timm.optim.cadamw.
timm release v1.0.12, with a focus on optimizers. The optimizer factory has been refactored, there's now a timm.optim.list_optimizers() and a new way to register optimizers and their attributes. As always, you can use a timm optimizer like a torch one: just replace torch.optim with timm.optim.
New optimizers:
- adafactorbv
- adopt / adoptw (decoupled decay)
- mars
- laprop
- c-prefixed ("cautious") variants, as well: cadamw, cnadamw, csgdw, clamb, crmsproptf