Commit 86ba4c0

Merge pull request #44 from bigcode-project/fix-dist-opt
fix distributed optimizer
2 parents c41f2b1 + b691302 commit 86ba4c0

1 file changed: 0 additions, 3 deletions


megatron/training.py (0 additions, 3 deletions):

@@ -391,9 +391,6 @@ def setup_model_and_optimizer(model_provider_func,
         torch.distributed.barrier()
         timers('load-checkpoint').stop()
         timers.log(['load-checkpoint'])
-        # This is critical when only model is loaded. We should make sure
-        # main parameters are also updated.
-        optimizer.reload_model_params()
     else:
         args.iteration = 0
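For context on what the removed call does: in mixed-precision training setups such as Megatron's, the optimizer typically keeps separate fp32 "main" copies of the model's fp16 parameters, and a `reload_model_params()`-style method re-syncs those copies from the model after a checkpoint load. The sketch below is a hypothetical illustration of that pattern, not the actual Megatron-LM implementation; the class name and structure are invented for clarity.

```python
# Hypothetical sketch of a mixed-precision optimizer's
# reload_model_params(); NOT the actual Megatron-LM code.
import torch


class MixedPrecisionOptimizerSketch:
    def __init__(self, model_params):
        # The model holds fp16 params; the optimizer keeps fp32 masters.
        self.model_params = list(model_params)
        self.main_params = [p.detach().clone().float() for p in self.model_params]

    def reload_model_params(self):
        # Refresh the fp32 master copies from the current model params.
        # Needed when a checkpoint restored only the model, so the
        # optimizer would otherwise keep stepping stale master weights.
        for main, model in zip(self.main_params, self.model_params):
            main.data.copy_(model.data.float())


# Usage: simulate loading a checkpoint that overwrites only the model
# params, then re-sync the optimizer's master copies.
model_params = [torch.ones(4, dtype=torch.float16)]
opt = MixedPrecisionOptimizerSketch(model_params)
model_params[0].data.fill_(2.0)   # "checkpoint load" touches model only
opt.reload_model_params()
```

The commit removes this re-sync from `setup_model_and_optimizer`, presumably because the distributed optimizer handles parameter synchronization differently; the sketch only shows why such a call exists in the first place.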
