
Change how GradientClipCallback stores `parameters` to avoid an error under torch 1.5

tags/v0.5.5
yh_cc · 5 years ago
commit 62fe53b147
1 changed file with 1 addition and 1 deletion:
  fastNLP/core/callback.py (+1, -1)

fastNLP/core/callback.py

@@ -464,7 +464,7 @@ class GradientClipCallback(Callback):
             self.clip_fun = nn.utils.clip_grad_value_
         else:
             raise ValueError("Only supports `norm` or `value` right now.")
-        self.parameters = parameters
+        self.parameters = list(parameters)
         self.clip_value = clip_value
 
     def on_backward_end(self):
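
Why storing `list(parameters)` instead of the raw `parameters` matters: callers typically pass `model.parameters()`, which is a generator, and the callback re-clips on every backward pass. A generator is exhausted after its first traversal, so from the second step onward the clip function sees an empty sequence; per the commit message this raises an error under torch 1.5 (whose `clip_grad_norm_` stacks per-parameter norms and fails on an empty list), while older versions silently clipped nothing. A minimal standalone sketch of the failure mode, not fastNLP's actual code:

# Sketch: a one-shot generator vs. a list snapshot under repeated clipping.
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
params_gen = model.parameters()          # one-shot generator
params_list = list(model.parameters())   # reusable snapshot (the fix)

for step in range(2):
    model(torch.randn(8, 4)).sum().backward()
    nn.utils.clip_grad_norm_(params_list, max_norm=1.0)  # works every step
    try:
        nn.utils.clip_grad_norm_(params_gen, max_norm=1.0)
    except RuntimeError as e:
        # On torch 1.5 the second step hits an empty parameter sequence and
        # raises; other versions may instead return a zero norm silently.
        print(f"step {step}: {e}")

Snapshotting into a list also pins down exactly which tensors get clipped when the caller passes any one-shot iterable, at the cost of not picking up parameters added to the model later.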

