PyTorch: Can autograd be used when the final tensor has more than a single value in it?
Question
Can autograd be used when the final tensor has more than a single value in it?
I tried the following:
import torch

x = torch.tensor([4.0, 5.0], requires_grad=True)
y = x ** 2
print(y)
y.backward()
which throws the error:
RuntimeError: grad can be implicitly created only for scalar outputs
But the following works:
x = torch.tensor([4.0, 5.0], requires_grad=True)
y = x ** 2
y = torch.sum(y)
print(y)
y.backward()
print(x.grad)
The output is:
tensor(41., grad_fn=<SumBackward0>)
tensor([ 8., 10.])
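As a side note, summing before calling backward is the same as back-propagating a vector of ones through the unreduced output. The following sketch (my own illustration, not part of the original question) checks this:

```python
import torch

x = torch.tensor([4.0, 5.0], requires_grad=True)

# Path 1: reduce to a scalar first, then backward.
(x ** 2).sum().backward()
grad_via_sum = x.grad.clone()

# Path 2: keep the vector output, but pass a vector of ones as the gradient.
x.grad = None
y = x ** 2
y.backward(torch.ones_like(y))

# Both paths produce the same gradient.
print(torch.equal(grad_via_sum, x.grad))
```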
Am I missing something here or can I proceed with the assumption that autograd only works when the final tensor has a single value in it?
Answer
See https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html#gradients
y.backward()
is equivalent to y.backward(torch.tensor(1.0))
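A quick sketch (my own, not from the answer) verifying this equivalence for a scalar output:

```python
import torch

x = torch.tensor([4.0, 5.0], requires_grad=True)
y = (x ** 2).sum()  # scalar output

# Passing torch.tensor(1.0) explicitly gives the same result
# as the implicit default used by y.backward().
y.backward(torch.tensor(1.0))
print(x.grad)  # tensor([ 8., 10.])
```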
Usually the output is a scalar, so a scalar is passed to backward as the default. However, since your output is a tensor with two elements, you should call
y.backward(torch.tensor([1.0, 1.0]))
This will give x.grad as tensor([ 8., 10.])
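More generally, the tensor you pass to backward acts as the vector v in the vector-Jacobian product vᵀJ, so each element weights the corresponding output's gradient. A sketch with a non-uniform vector (my own example, not from the answer):

```python
import torch

x = torch.tensor([4.0, 5.0], requires_grad=True)
y = x ** 2  # Jacobian of y w.r.t. x is diag([8., 10.])

# Weighting the second output by 2 doubles its contribution to x.grad.
y.backward(torch.tensor([1.0, 2.0]))
print(x.grad)  # tensor([ 8., 20.])
```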