Nick Martin and glenn-jocher committed
Commit a88a814 (unverified) · 1 parent: 245d645

Copy wandb param dict before training to avoid overwrites (#7317)

* Copy wandb param dict before training to avoid overwrites.

Copy the hyperparameter dict retrieved from the wandb configuration before passing it to `train()`. Training overwrites parameters in the dictionary in place (e.g. scaling the obj/box/cls loss gains), so the values reported in wandb no longer match the input values. This is confusing, makes runs hard to reproduce, and throws off wandb's Bayesian sweep algorithm. A minimal sketch of the failure mode follows below.

* Cleanup

Co-authored-by: Glenn Jocher <[email protected]>
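
For illustration, here is a minimal, self-contained sketch of the failure mode. The train_like() helper and its x2 gain scaling are hypothetical stand-ins for YOLOv5's train() (which rescales gains in place, per the commit message), and a plain dict stands in for the wandb.config items:

    # Hypothetical stand-in for train(): mutates the hyp dict in place.
    def train_like(hyp):
        for k in ("box", "obj", "cls"):
            hyp[k] *= 2  # illustrative scaling factor only

    # Without a copy: the dict the sweep agent reported gets mutated.
    sweep_params = {"box": 0.05, "obj": 1.0, "cls": 0.5}
    train_like(sweep_params)
    print(sweep_params["box"])  # 0.1 -- no longer matches the value wandb logged

    # With a copy: train_like() mutates its own dict; the reported one is intact.
    sweep_params = {"box": 0.05, "obj": 1.0, "cls": 0.5}
    train_like(sweep_params.copy())
    print(sweep_params["box"])  # 0.05 -- matches the value wandb logged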

Files changed (1)
  1. utils/loggers/wandb/sweep.py (+2 -2)
utils/loggers/wandb/sweep.py CHANGED
@@ -16,8 +16,8 @@ from utils.torch_utils import select_device
 
 def sweep():
     wandb.init()
-    # Get hyp dict from sweep agent
-    hyp_dict = vars(wandb.config).get("_items")
+    # Get hyp dict from sweep agent. Copy because train() modifies parameters which confused wandb.
+    hyp_dict = vars(wandb.config).get("_items").copy()
 
     # Workaround: get necessary opt args
     opt = parse_opt(known=True)
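
One note on the design choice: dict.copy() is a shallow copy, which is sufficient here on the assumption that the sweep hyperparameter dict holds only flat scalar values (floats and ints), as YOLOv5 hyp dicts generally do. If any value were a nested container, train() could still mutate it through the shared reference, and copy.deepcopy() would be needed instead.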