
Commit a88a814

n1mmy and glenn-jocher authored
Copy wandb param dict before training to avoid overwrites (#7317)
* Copy wandb param dict before training to avoid overwrites

  Copy the hyperparameter dict retrieved from the wandb configuration before passing it to `train()`. Training overwrites parameters in the dictionary (e.g. scaling the obj/box/cls gains), which causes the values reported in wandb to not match the input values. This is confusing, since it makes it hard to reproduce a run, and it also throws off wandb's Bayesian sweep algorithm.

* Cleanup

Co-authored-by: Glenn Jocher <[email protected]>
1 parent 245d645 commit a88a814

File tree: 1 file changed (+2 −2 lines)

utils/loggers/wandb/sweep.py (2 additions, 2 deletions)

```diff
@@ -16,8 +16,8 @@
 def sweep():
     wandb.init()
-    # Get hyp dict from sweep agent
-    hyp_dict = vars(wandb.config).get("_items")
+    # Get hyp dict from sweep agent. Copy because train() modifies parameters which confused wandb.
+    hyp_dict = vars(wandb.config).get("_items").copy()

     # Workaround: get necessary opt args
     opt = parse_opt(known=True)
```
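The bug being fixed is plain Python dict aliasing: if `train()` receives the very dict object the sweep agent owns, any in-place scaling of its values also changes what the agent later reports. A minimal sketch of the effect (not YOLOv5 code; the `train` stand-in and `"box"` scaling here are hypothetical illustrations):

```python
# Sketch of the aliasing bug this commit fixes. If train() mutates the
# same dict object the sweep agent holds, the reported hyperparameters
# no longer match the values the agent actually chose.

def train(hyp):
    # Hypothetical stand-in for training code that rescales a gain in place.
    hyp["box"] *= 3.0

# Without a copy: train() mutates the agent's own dict.
sweep_config = {"box": 0.05, "lr0": 0.01}  # values chosen by the sweep agent
train(sweep_config)
mutated_box = sweep_config["box"]  # now scaled, not the input value

# With a copy (the fix): the agent's dict is left untouched.
sweep_config = {"box": 0.05, "lr0": 0.01}
train(sweep_config.copy())
preserved_box = sweep_config["box"]  # still the original 0.05
```

Note that `dict.copy()` is shallow; it is sufficient here because the hyperparameter values are scalars, but nested mutable values would still be shared between the copies.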

0 commit comments