Why use Variable() in inference?
Problem description
I am learning PyTorch for an image classification task, and I ran into code where someone used a PyTorch Variable()
in their function for prediction:
```python
def predict_image(image):
    image_tensor = test_transforms(image).float()
    image_tensor = image_tensor.unsqueeze_(0)
    input = Variable(image_tensor)
    input = input.to(device)
    output = model(input)
    index = output.data.cpu().numpy().argmax()
    return index
```
Why do they use Variable() here? (Even though it works fine without it.)
Recommended answer
You can safely omit it. Variables are a legacy component of PyTorch, now deprecated, that used to be required for autograd:
Warning
The Variable API has been deprecated: Variables are no longer necessary to use autograd with tensors. Autograd automatically supports Tensors with requires_grad set to True. Below please find a quick guide on what has changed:
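As a minimal sketch of what that guide means in practice: setting requires_grad=True on a plain tensor is enough for autograd to track it, with no wrapper involved.

```python
import torch

# Modern autograd: a plain tensor with requires_grad=True is tracked
# directly; no Variable() wrapper is needed.
x = torch.ones(3, requires_grad=True)
y = (x * 2).sum()
y.backward()
print(x.grad)  # tensor([2., 2., 2.])
```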
Variable(tensor) and Variable(tensor, requires_grad) still work as expected, but they return Tensors instead of Variables.