Problem description
Picking up from the previous post, this is another issue I ran into in an attack-and-defense project: RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
Cause analysis and solution
The message itself points to the cause: by the time backward is called a second time, the graph's saved intermediate tensors have already been freed, so the runtime error is raised. The fix is simple:
change .backward()
to .backward(retain_graph=True)
With that change, the code runs again and this error is gone.
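A minimal sketch of the fix described above (the toy computation is illustrative, not from the original project): passing retain_graph=True on the first backward keeps the graph's saved tensors alive, so a second backward over the same graph succeeds, with gradients accumulating into .grad.

```python
import torch

# Toy graph: y = sum(x^2), so dy/dx = 2x.
x = torch.tensor([2.0], requires_grad=True)
y = (x * x).sum()

# First backward: keep the graph's saved tensors so it can be traversed again.
y.backward(retain_graph=True)
print(x.grad)  # tensor([4.])

# Second backward over the same graph now works; gradients accumulate.
y.backward()
print(x.grad)  # tensor([8.])
```

Note that retain_graph=True trades memory for convenience; if the forward pass is re-run every iteration anyway, the default behavior is usually what you want.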
Done, confetti!
But then a new, equally headache-inducing problem showed up: RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
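For reference, this second error is easy to reproduce: it typically means the tensor you call backward on was computed from inputs that are not tracked by autograd, so it has no grad_fn. A minimal sketch (illustrative, not the project's actual code):

```python
import torch

# Input created without requires_grad: the result y has no grad_fn.
x = torch.tensor([2.0])          # requires_grad defaults to False
y = (x * x).sum()
try:
    y.backward()
except RuntimeError as e:
    print(e)  # "element 0 of tensors does not require grad and does not have a grad_fn"

# Enabling gradient tracking on the input gives y a grad_fn again.
x = torch.tensor([2.0], requires_grad=True)
y = (x * x).sum()
y.backward()
print(x.grad)  # tensor([4.])
```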