PyTorch: Fixing the BucketIterator.splits warning "volatile was removed and now has no effect. Use `with torch.no_grad():` instead"
Posted by Better Bench
Problem
When the training and validation sets are wrapped with data.BucketIterator.splits, iterating over the resulting iterators raises the warning:
UserWarning: volatile was removed and now has no effect. Use `with torch.no_grad():` instead. return Variable(arr, volatile=not train)
import torch
from torchtext import data

# Wrap both splits with a single call to BucketIterator.splits
train_iter, valid_iter = data.BucketIterator.splits(
    (train_data, valid_data),
    batch_size=batch_size,
    sort_key=lambda x: len(x.text),
    repeat=False,
    shuffle=True)

with torch.no_grad():
    for idx, batch in enumerate(valid_iter):  # the warning is raised on this line
        pass
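For completeness, the snippets in this post assume that train_data, valid_data, and batch_size already exist. A minimal sketch of how they might be built with the legacy torchtext Field/TabularDataset API follows; the file names and field names are placeholders, not from the original post:

import torch
from torchtext import data

# 'text' matches the field name used by sort_key=lambda x: len(x.text)
TEXT = data.Field(sequential=True, lower=True, batch_first=True)
LABEL = data.LabelField(dtype=torch.float)

# Hypothetical CSV files with 'text' and 'label' columns.
train_data, valid_data = data.TabularDataset.splits(
    path='data', train='train.csv', validation='valid.csv',
    format='csv', skip_header=True,
    fields=[('text', TEXT), ('label', LABEL)])

TEXT.build_vocab(train_data)
LABEL.build_vocab(train_data)
batch_size = 64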
Solution
Wrap each split separately: use data.BucketIterator instead of data.BucketIterator.splits.
from torchtext import data

# Build one BucketIterator per split instead of using BucketIterator.splits
train_iter = data.BucketIterator(
    train_data,
    batch_size=batch_size,
    sort_key=lambda x: len(x.text),
    repeat=False,
    shuffle=True)

valid_iter = data.BucketIterator(
    valid_data,
    batch_size=batch_size,
    sort_key=lambda x: len(x.text),
    repeat=False,
    shuffle=True)
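With the iterators built separately as above, the evaluation loop from the problem section can be run unchanged and, per the author's fix, should no longer trigger the warning. A short usage sketch, where model stands in for whatever nn.Module is being evaluated (a placeholder, not defined in the original post):

model.eval()
with torch.no_grad():
    for idx, batch in enumerate(valid_iter):
        output = model(batch.text)  # forward pass only; no autograd graph is built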