Commit 473f91d

for diffgrad, clone gradient before saving as previous gradient (#57)
1 parent: f53d18b

File tree: 1 file changed (+1, -1)

torch_optimizer/diffgrad.py (1 addition & 1 deletion)
@@ -114,7 +114,7 @@ def step(self, closure: OptLossClosure = None) -> OptFloat:
                 # compute diffgrad coefficient (dfc)
                 diff = torch.abs(previous_grad - grad)
                 dfc = torch.div(1.0, (1.0 + torch.exp(-diff)))
-                state['previous_grad'] = grad
+                state['previous_grad'] = grad.clone()
 
                 # update momentum with dfc
                 exp_avg1 = exp_avg * dfc
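Why the clone matters: PyTorch accumulates gradients into the same tensor in place, so storing the raw grad in state['previous_grad'] keeps an alias of the live gradient buffer rather than a snapshot of its values. On the next step that buffer already holds the new gradient, previous_grad - grad is identically zero, and the dfc formula above collapses to a constant 0.5. The following is a minimal toy sketch of that aliasing, not code from the repository:

import torch

# Toy demonstration of the aliasing bug fixed in this commit (not library code).
p = torch.zeros(3, requires_grad=True)

(p * torch.tensor([1.0, 2.0, 3.0])).sum().backward()
previous_grad = p.grad        # pre-fix behavior: an alias of the live buffer
snapshot = p.grad.clone()     # post-fix behavior: an independent snapshot

p.grad.zero_()                # in-place zeroing, as optimizer.zero_grad() does
(p * torch.tensor([4.0, 5.0, 6.0])).sum().backward()

print(torch.abs(previous_grad - p.grad))  # tensor([0., 0., 0.]): dfc stuck at 0.5
print(torch.abs(snapshot - p.grad))       # tensor([3., 3., 3.]): the intended signal

The one-line fix trades a small extra allocation per parameter for a correct previous-gradient snapshot, in line with the other per-parameter state buffers the optimizer already keeps.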
