('Trying to update a Tensor ', <tf.Tensor: shape=(), dtype=float32, numpy=3.0>)
【Posted】: 2021-03-12 21:17:53
【Question Description】: I am trying to run the example shown here:
https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/Optimizer
but it gives me the error shown below. I am using Linux with Python 3.
import tensorflow as tf
import numpy as np
var1=tf.constant(3.0)
var2=tf.constant(3.0)
# Create an optimizer with the desired parameters.
opt = tf.keras.optimizers.SGD(learning_rate=0.1)
# `loss` is a callable that takes no argument and returns the value
# to minimize.
loss = lambda: 3 * var1 * var1 + 2 * var2 * var2
# In graph mode, returns op that minimizes the loss by updating the listed
# variables.
opt_op = opt.minimize(loss, var_list=[var1, var2])
opt_op.run()
# In eager mode, simply call minimize to update the list of variables.
opt.minimize(loss, var_list=[var1, var2])
---------------------------------------------------------------------------
NotImplementedError Traceback (most recent call last)
<ipython-input-1-f7fa46c26670> in <module>()
12 # In graph mode, returns op that minimizes the loss by updating the listed
13 # variables.
---> 14 opt_op = opt.minimize(loss, var_list=[var1, var2])
15 opt_op.run()
16 # In eager mode, simply call minimize to update the list of variables.
10 frames
/usr/local/lib/python3.6/dist-packages/tensorflow/python/keras/optimizer_v2/optimizer_v2.py in apply_grad_to_update_var(var, grad)
592 """Apply gradient to variable."""
593 if isinstance(var, ops.Tensor):
--> 594 raise NotImplementedError("Trying to update a Tensor ", var)
595
596 apply_kwargs = {}
NotImplementedError: ('Trying to update a Tensor ', <tf.Tensor: shape=(), dtype=float32, numpy=3.0>)
【Comments】:
You need to use tf.Variable, not constant.
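The reason is that an optimizer updates its parameters in place through methods such as assign_sub, which tf.Variable provides but the immutable tf.Tensor returned by tf.constant does not, so the optimizer explicitly rejects plain Tensors. A minimal sketch of the difference (illustration only, values chosen arbitrarily):
import tensorflow as tf

v = tf.Variable(3.0)
v.assign_sub(1.8)        # in-place update works on a Variable: v is now 1.2
print(v.numpy())         # 1.2

c = tf.constant(3.0)
# c.assign_sub(1.8)      # a plain tf.Tensor has no assign/assign_sub method,
#                        # which is why the optimizer raises NotImplementedError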
【Answer 1】:
As @xdurch0 suggested, use tf.Variable instead of tf.constant.
Please check the working example code below.
import tensorflow as tf
import numpy as np
var1=tf.Variable(3.0)
var2=tf.Variable(3.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)
# `loss` is a callable that takes no argument and returns the value
# to minimize.
loss = lambda: 3 * var1 * var1 + 2 * var2 * var2
# In graph mode, returns op that minimizes the loss by updating the listed
# variables.
#opt_op = opt.minimize(loss, var_list=[var1, var2])
#opt_op.run()
# In eager mode, simply call minimize to update the list of variables.
opt.minimize(loss, var_list=[var1, var2])
opt.variables()
Output:
<function <lambda> at 0x7efdebc7f048>
[<tf.Variable 'SGD/iter:0' shape=() dtype=int64, numpy=1>]
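As a quick sanity check, here is a minimal sketch (assuming the same loss and learning rate as above) that reads the variables back after one step: the gradients are 6*var1 and 4*var2, so with learning_rate=0.1 the values should move from 3.0 to roughly 1.2 and 1.8.
import tensorflow as tf

var1 = tf.Variable(3.0)
var2 = tf.Variable(3.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)
loss = lambda: 3 * var1 * var1 + 2 * var2 * var2

opt.minimize(loss, var_list=[var1, var2])

# One SGD step: var1 -> 3.0 - 0.1 * (6 * 3.0) = 1.2
#               var2 -> 3.0 - 0.1 * (4 * 3.0) = 1.8
print(var1.numpy(), var2.numpy())  # expected: approximately 1.2 1.8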
【Discussion】: