```
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been
freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if
you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
```
- This happens because the intermediate activations saved for backward are freed during the first `.backward()` pass; backpropagating through the same graph a second time then fails.
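A minimal sketch that reproduces the error and the `retain_graph=True` fix (toy tensors here, not the original model):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).sum()
y.backward()        # the first backward frees the graph's saved tensors
# y.backward()      # would raise the RuntimeError above

# if the same graph really must be backpropagated through twice,
# keep its saved tensors alive on the first pass:
z = (x * 3).sum()
z.backward(retain_graph=True)
z.backward()        # second backward now succeeds; gradients accumulate in x.grad
```

In most training loops, though, the better fix is to build a fresh graph each iteration rather than retaining the old one.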
`self.optimizer.zero_grad()`
- To be precise, `zero_grad()` only clears the parameters' accumulated `.grad` fields; it is `.backward()` itself that frees the graph's saved intermediate tensors, which is what triggers the error above.
- Skipping `zero_grad()` by itself only makes gradients accumulate across iterations. The "version number is wrong" error on intermediate tensors ("one of the variables needed for gradient computation has been modified by an inplace operation") instead comes from reusing an old graph after an in-place modification, typically after `optimizer.step()` has updated the parameters in place.
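For reference, a sketch of the standard loop order that avoids both errors (the `model`, `optimizer`, and data below are placeholders, not the original code): each iteration builds a fresh graph, so nothing is backpropagated through twice, and `step()`'s in-place updates never invalidate a graph that still needs a backward pass.

```python
import torch

model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
batches = [torch.randn(8, 4) for _ in range(3)]  # stand-in data

for batch in batches:
    optimizer.zero_grad()              # clears .grad only; the graph is untouched
    loss = model(batch).pow(2).mean()  # forward pass builds a fresh graph
    loss.backward()                    # frees this graph's saved tensors
    optimizer.step()                   # in-place update; safe, this graph is done
```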