Add torch-optimizer to allow trying different optimizers?

#7
by GottUndSo - opened

I would like to change optim_g = torch.optim.AdamW( and optim_d = torch.optim.AdamW( in train_nsf_sim_cache_sid_load_pretrain.py to something like DiffGrad for experimenting, but I always get AttributeError: module 'torch.optim' has no attribute 'DiffGrad'.
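For context, the optimizer setup in that script looks roughly like this (a sketch based on the VITS-style training code RVC builds on; names such as net_g, net_d and hps are taken from that code and may differ slightly):

```python
import torch

# Approximate current setup in train_nsf_sim_cache_sid_load_pretrain.py:
optim_g = torch.optim.AdamW(
    net_g.parameters(),          # generator parameters
    hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)
optim_d = torch.optim.AdamW(
    net_d.parameters(),          # discriminator parameters
    hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)

# Writing torch.optim.DiffGrad(...) raises the AttributeError above, because
# DiffGrad is not part of torch.optim; it ships in the separate
# torch-optimizer package.
```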

Would it be possible to add everything needed into the RVC-beta.7z? Or can someone tell me how to make it work?

You just need to pip install torch_optimizer and add import torch_optimizer as optim to the script.
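For example, the swap could look like this (a sketch, assuming the rest of the script stays unchanged; the hps argument names are carried over from the existing AdamW calls and may vary):

```python
import torch_optimizer as optim  # after: pip install torch_optimizer

optim_g = optim.DiffGrad(
    net_g.parameters(),
    lr=hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)
optim_d = optim.DiffGrad(
    net_d.parameters(),
    lr=hps.train.learning_rate,
    betas=hps.train.betas,
    eps=hps.train.eps,
)
```

Aliasing the package as optim only works if nothing else in the script already uses that name; otherwise pick a different alias.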

Wasn't working for me. But after the pip install I just copied the diffgrad.py file into torch's optimizer folder and added it to that __init__.py.

I tried to add import torch_optimizer as optim but it didn't work; it shows ModuleNotFoundError: No module named 'torch_optimizer' even though I have installed the torch optimizer package.
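A ModuleNotFoundError at that point usually means the package went into a different Python environment than the one that runs the training script (the RVC-beta bundle likely ships its own runtime). A quick check, run with the same interpreter that launches the script:

```python
import sys
print(sys.executable)          # which interpreter is actually running

import torch_optimizer         # fails with ModuleNotFoundError if this
                               # interpreter cannot see the package
print("torch_optimizer found at", torch_optimizer.__file__)
```

If the import fails there, installing with that interpreter explicitly (something like <path-to-bundle>\python.exe -m pip install torch_optimizer) is the usual fix.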

Wasn't working for me. But after the pip install I just copied the diffgrad.py file into torch's optimizer folder and added it to that __init__.py.

How exactly did you do that?

In your newly installed package you will find a file called diffgrad.py. Copy that into the optimizers folder that is inside the torch folder. In that folder there is also an __init__.py.
I added this line there:
(screenshot: grafik.png, showing the added import line)
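For reference, the added line would typically look like the following (a sketch; the exact module and class names depend on where diffgrad.py was copied, and patching the installed torch package like this is a workaround that gets wiped by any torch reinstall or upgrade):

```python
# Appended to the __init__.py sitting next to the copied diffgrad.py,
# so the class is exposed on the package namespace:
from .diffgrad import DiffGrad
```

If the file went into torch/optim, optim_g = torch.optim.DiffGrad(...) then resolves in the training script without importing torch_optimizer at all.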

Thanks, it is working now.
