RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Posted by AI浩


Problem description

The following error occurred when reloading the model and resuming training:

Traceback (most recent call last):
  File "D:\\Ghost_Demo\\train.py", line 200, in <module>
    train_loss, train_acc = train(model_ft, DEVICE, train_loader, optimizer, epoch,model_ema)
  File "D:\\Ghost_Demo\\train.py", line 40, in train
    scaler.scale(loss).backward()
  File "D:\\Users\\wh109\\anaconda3\\lib\\site-packages\\torch\\_tensor.py", line 396, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "D:\\Users\\wh109\\anaconda3\\lib\\site-packages\\torch\\autograd\\__init__.py", line 173, in backward
    Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

Cause

This error occurs because the tensor being differentiated was created without gradient tracking. When constructing a Variable, you need to pass the parameter requires_grad=True, which controls whether gradients are computed with respect to that variable. It defaults to False, meaning no gradient is tracked, so the resulting loss has no grad_fn and backward() fails.
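A minimal reproduction of this behavior (standalone sketch, independent of the training script above): a tensor created with the default requires_grad=False produces a result with no grad_fn, and calling backward() on it raises exactly this error, while enabling requires_grad makes the same call succeed.

```python
import torch

# requires_grad defaults to False, so the result has no grad_fn.
x = torch.randn(3)
loss = (x * 2).sum()
try:
    loss.backward()
except RuntimeError as e:
    print(e)  # element 0 of tensors does not require grad and does not have a grad_fn

# With gradient tracking enabled, the same computation is differentiable.
x = torch.randn(3, requires_grad=True)
loss = (x * 2).sum()
loss.backward()
print(x.grad)  # d(2*x)/dx = 2 for every element
```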

Solution

data, target = data.to(device, non_blocking=True), target.to(device, non_blocking=True)

Change it to:

from torch.autograd import Variable
data, target = Variable(data, requires_grad=True).to(device, non_blocking=True), target.to(device, non_blocking=True)
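Note that Variable has been deprecated since PyTorch 0.4: plain tensors carry autograd state, so the same fix can be written without the wrapper via requires_grad_(). Also, when this error appears after reloading a checkpoint, a common root cause is that the model's parameters were frozen; re-enabling them fixes the loss directly. A sketch of both variants (model and shapes here are hypothetical stand-ins, not from the training script above):

```python
import torch
import torch.nn as nn

# Modern equivalent of Variable(data, requires_grad=True):
data = torch.randn(2, 4).requires_grad_()

# If the reloaded model's parameters were frozen, unfreeze them instead:
model = nn.Linear(4, 1)  # stand-in for the reloaded model
for p in model.parameters():
    p.requires_grad_(True)

loss = model(data).sum()
loss.backward()  # succeeds: loss now has a grad_fn
```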
