How to compute the gradient of an image based on a certain class?

Posted: 2017-05-09 15:42:16

In Breaking Linear Classifiers on ImageNet, the author proposes the following approach for creating adversarial images that fool ConvNets:
In short, to create a fooling image we start from whatever image we want (an actual image, or even a noise pattern), and then use backpropagation to compute the gradient of any class score with respect to the image pixels, and nudge the image along it. We may, but do not have to, repeat the process a few times. You can interpret backpropagation in this setting as using dynamic programming to compute the most damaging local perturbation of the input. Note that this process is very efficient and takes negligible time if you have access to the parameters of the ConvNet (backprop is fast), but it is possible to do this even if you do not have access to the parameters, only to the class scores at the end. In that case, one can compute the data gradient numerically, or use other local stochastic search strategies, etc. Note that because of this latter approach, even non-differentiable classifiers (e.g. Random Forests) are not safe (but I haven't seen anyone confirm this empirically yet).
I know that I can compute the gradient of an image like this:
np.gradient(img)
But how do I compute the gradient of an image with respect to another image's class using TensorFlow or NumPy? Presumably I need to do something similar to the process in this tutorial, such as:
cross_entropy = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(y_conv, y_))
train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)
correct_prediction = tf.equal(tf.argmax(y_conv, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
sess.run(tf.initialize_all_variables())
for i in range(20000):
    batch = mnist.train.next_batch(50)
    if i % 100 == 0:
        train_accuracy = accuracy.eval(feed_dict={
            x: batch[0], y_: batch[1], keep_prob: 1.0})
        print("step %d, training accuracy %g" % (i, train_accuracy))
    train_step.run(feed_dict={x: batch[0], y_: batch[1], keep_prob: 0.5})

print("test accuracy %g" % accuracy.eval(feed_dict={
    x: mnist.test.images, y_: mnist.test.labels, keep_prob: 1.0}))
But I'm not sure how exactly... Specifically, I have an image of the digit 2, shown below:
array([[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.14117648, 0.49019611, 0.74901962,
0.85490203, 1. , 0.99607849, 0.99607849, 0.9450981 ,
0.20000002, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0.80000007, 0.97647065, 0.99215692, 0.99215692,
0.99215692, 0.99215692, 0.99215692, 0.99215692, 0.99215692,
0.98039222, 0.92156869, 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.34509805,
0.9450981 , 0.98431379, 0.99215692, 0.88235301, 0.55686277,
0.19215688, 0.04705883, 0.04705883, 0.04705883, 0.41176474,
0.99215692, 0.99215692, 0.43529415, 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0.37254903, 0.88235301,
0.99215692, 0.65490198, 0.44313729, 0.05490196, 0. ,
0. , 0. , 0. , 0. , 0.0627451 ,
0.82745105, 0.99215692, 0.45882356, 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.35686275, 0.9333334 , 0.99215692,
0.66666669, 0.10980393, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.58823532, 0.99215692, 0.45882356, 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0.38431376, 0.98431379, 0.85490203, 0.18823531,
0.01960784, 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.58823532, 0.99215692, 0.45882356, 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0.43921572, 0.99215692, 0.43921572, 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.03529412,
0.72156864, 0.94901967, 0.07058824, 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0.07843138, 0.17647059, 0.01960784, 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.26274511,
0.99215692, 0.94117653, 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0.10588236, 0.91764712,
0.97254908, 0.41176474, 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.17254902, 0.6156863 , 0.99215692,
0.51764709, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0.04313726, 0.74117649, 0.99215692, 0.7960785 ,
0.10588236, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.04313726, 0.61176473, 0.99215692, 0.96470594, 0.3019608 ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.04313726,
0.61176473, 0.99215692, 0.79215693, 0.26666668, 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0.04313726, 0.61176473,
0.99215692, 0.88627458, 0.27843139, 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0.11764707, 0.12941177,
0.12941177, 0.54901963, 0.63921571, 0.72941178, 0.99215692,
0.88627458, 0.14901961, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0.04705883, 0.31764707, 0.95686281, 0.99215692,
0.99215692, 0.99215692, 0.99215692, 0.99215692, 0.99215692,
0.99215692, 0.72941178, 0.27450982, 0.09019608, 0. ,
0. , 0.08627451, 0.61176473, 0.3019608 , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0.3137255 , 0.76470596, 0.99215692, 0.99215692, 0.99215692,
0.99215692, 0.99215692, 0.97254908, 0.91764712, 0.65098041,
0.97254908, 0.99215692, 0.99215692, 0.94117653, 0.58823532,
0.28627452, 0.56470591, 0.40784317, 0.20000002, 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0.02745098,
0.97254908, 0.99215692, 0.99215692, 0.99215692, 0.99215692,
0.99215692, 0.94901967, 0.41176474, 0. , 0. ,
0.41960788, 0.94901967, 0.99215692, 0.99215692, 0.99215692,
0.96078438, 0.627451 , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0.22352943,
0.98039222, 0.99215692, 0.99215692, 0.99215692, 0.96862751,
0.52941179, 0.08235294, 0. , 0. , 0. ,
0. , 0.08235294, 0.45882356, 0.71764708, 0.71764708,
0.18823531, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0.47450984, 0.48235297, 0.6901961 , 0.52941179, 0.0627451 ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ]], dtype=float32)
How can I compute the gradient of this image with respect to the class of digit-6 images (an example is shown below)? (I think I need to use backpropagation to compute the gradient over all of the images of the digit 6.)
array([[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.19215688, 0.70588237, 0.99215692,
0.95686281, 0.19607845, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.72156864, 0.98823535, 0.98823535,
0.90980399, 0.64313728, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0.25882354, 0.91764712, 0.98823535, 0.53333336,
0.14901961, 0.21960786, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.07450981, 0.92549026, 0.98823535, 0.6901961 , 0.01568628,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.29803923, 0.98823535, 0.98823535, 0.21960786, 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.54509807, 0.99215692, 0.67843139, 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.08627451,
0.83137262, 0.98823535, 0.27058825, 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.45490199,
0.99215692, 0.94117653, 0.19607845, 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.6156863 ,
0.99215692, 0.80784321, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.90196085,
0.99215692, 0.40000004, 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.90588242,
1. , 0.70588237, 0.5411765 , 0.70588237, 0.99215692,
1. , 0.99215692, 0.8705883 , 0.38039219, 0.01176471,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.90196085,
0.99215692, 0.98823535, 0.98823535, 0.98823535, 0.98823535,
0.82745105, 0.98823535, 0.98823535, 0.98823535, 0.45882356,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.90196085,
0.99215692, 0.94117653, 0.71764708, 0.34901962, 0.27058825,
0.02745098, 0.27058825, 0.67058825, 0.98823535, 0.98823535,
0.33333334, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.52941179,
0.99215692, 0.60000002, 0. , 0. , 0. ,
0. , 0. , 0.0509804 , 0.84313732, 0.98823535,
0.45490199, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.45490199,
0.99215692, 0.80784321, 0. , 0. , 0. ,
0. , 0. , 0. , 0.60784316, 0.98823535,
0.45490199, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0.41568631,
1. , 0.82745105, 0.02745098, 0. , 0. ,
0. , 0. , 0.19215688, 0.91372555, 0.99215692,
0.45490199, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.62352943, 0.98823535, 0.60392159, 0.03529412, 0. ,
0. , 0.11764707, 0.77254909, 0.98823535, 0.98823535,
0.37254903, 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0.06666667, 0.89019614, 0.98823535, 0.60392159, 0.27450982,
0.31764707, 0.89411771, 0.98823535, 0.89019614, 0.50980395,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0.19607845, 0.89019614, 0.98823535, 0.98823535,
0.99215692, 0.98823535, 0.72549021, 0.19607845, 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0.18823531, 0.7019608 , 0.98823535,
0.74509805, 0.45882356, 0.02352941, 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ],
[ 0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. , 0. , 0. ,
0. , 0. , 0. ]], dtype=float32)
Thanks in advance for your help!
I've asked two related questions:
How to use image and weight matrix to create adversarial images in TensorFlow?
How to create adversarial images for ConvNet?
And here is my script.
【Question comments】:

What do you mean by "the gradient of an image with respect to another image"? Do you mean the difference between the two? Or do you mean the image gradient?

You can only compute the gradient of a function. For example, if a function maps an image (the input) to a vector of class scores (the output), you can compute the gradient of the output with respect to the input. But here you say you want to compute the gradient of one image with respect to another image, and you would need a function that maps images to images to do that. This answer may help you understand.

@GeorgeLiu: The TensorFlow tutorial on Deep Dream has an example of taking the gradient of an image with respect to a certain layer. In particular, [this section](nbviewer.jupyter.org/github/tensorflow/tensorflow/blob/master/…) should help (i.e. t_grad = tf.gradients(t_score, t_input)[0]). Note: the blog you quote is not taking the gradient of one image with respect to another image. It is taking the gradient of the image with respect to a class score. In Deep Dream, the class score is just another layer.

This link has a video on adversarial networks and a well-documented example: github.com/Hvass-Labs/TensorFlow-Tutorials

@chris_anderson The output of a classifier neural network is usually a vector with one element per class, where each element (effectively) approximates the probability that the input image belongs to that class (or minimizes the log loss on the training images). Given a cat-vs-dog classifier network, an image of a cat might produce an output of (0.9, 0.1), i.e. 90% cat. The gradient this question asks about is the gradient of one of the elements of the network's output (a class probability) with respect to the pixels of the input image. It is not the image gradient over spatial coordinates.

【Answer 1】:

Class scores only
If you only have access to the class scores for whatever images you propose, then you can't really compute the gradient.
If what is returned can be viewed as a relative score for each class, then it is a vector v that is the result of some function f acting on a vector A containing all the information in the image*. The true gradient of the function is given by a matrix D(A), which depends on A, such that D(A)*B = (f(A + epsilon*B) - f(A))/epsilon in the limit of small epsilon, for any B. You could approximate this numerically using some small value of epsilon and a number of test matrices B (one for each element of A would suffice), but that would probably be unnecessarily expensive.
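To make the cost of that numerical approach concrete, here is a minimal NumPy sketch. It assumes a hypothetical black-box score_fn that maps a flattened image to a vector of class scores, and a target_class index; both names are stand-ins, not anything from the question's code.

import numpy as np

def numerical_class_gradient(score_fn, image, target_class, epsilon=1e-4):
    # Finite-difference estimate of d(score of target_class) / d(each pixel),
    # i.e. D(A)*B evaluated with B running over the unit vectors.
    grad = np.zeros_like(image)
    base = score_fn(image)[target_class]
    for i in range(image.size):              # one probe per element of A
        perturbed = image.copy()
        perturbed.flat[i] += epsilon         # B = i-th unit vector
        grad.flat[i] = (score_fn(perturbed)[target_class] - base) / epsilon
    return grad

For a 28x28 MNIST image this already means 784 extra forward passes per gradient estimate, which is exactly why it gets expensive.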
What you want to do is make the image as hard as possible for the algorithm to recognize. That is, for a given algorithm f, you want to maximize some suitable measure of how badly that algorithm recognizes each of your images A. There are many ways to do this. I'm not very familiar with them, but a talk I saw recently had some interesting material on it (https://wsc.project.cwi.nl/woudschoten-conferences/2016-woudschoten-conference/PRtalk1.pdf, see page 24 and onwards). If you have a high-dimensional input, computing the whole gradient is usually too expensive. Instead, you modify just one randomly chosen coordinate at a time and take many (many) small, cheap steps in the right direction, rather than some kind of optimal, large, expensive step.
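One way to read that "many small, cheap steps on randomly chosen coordinates" idea is a plain local random search. The sketch below again uses the hypothetical score_fn and target_class from above, and is only an illustration of the strategy, not the method from the linked talk.

import numpy as np

def random_coordinate_search(score_fn, image, target_class, steps=10000,
                             step_size=0.01, lo=0.0, hi=1.0, seed=0):
    # Nudge one randomly chosen pixel at a time; keep the change only if it
    # raises the target class score.
    rng = np.random.default_rng(seed)
    current = image.copy()
    best = score_fn(current)[target_class]
    for _ in range(steps):
        i = rng.integers(current.size)               # random coordinate
        delta = step_size * rng.choice([-1.0, 1.0])  # small, cheap step
        candidate = current.copy()
        candidate.flat[i] = np.clip(candidate.flat[i] + delta, lo, hi)
        score = score_fn(candidate)[target_class]
        if score > best:                             # keep only improving moves
            current, best = candidate, score
    return current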
Model available and suitable

If you know the model exactly and can write it explicitly as v = f(A), then you can compute the gradient of the function f. This would be the case if the algorithm you are trying to beat is a (possibly multi-layer) linear regression. The gradient should then take a form that is much easier to work with than the general one I wrote above.
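As a toy illustration of why the linear case is easy (my own example, not part of the original answer): for a single-layer linear classifier, the gradient of a class score with respect to the input is simply a fixed row of the weight matrix, independent of the image.

import numpy as np

# Hypothetical one-layer linear classifier: scores = W @ A + b
W = np.random.randn(10, 784)   # 10 classes, 784 pixels
b = np.random.randn(10)
A = np.random.rand(784)        # the image as a vector

scores = W @ A + b
target = 6
grad = W[target]               # d(scores[target]) / dA, the same for every A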
With this gradient available, and with it being fairly cheap to evaluate for different images A, you can then proceed with, for example, steepest descent (or ascent) methods to make the images hard for the algorithm to recognize.
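In the setting of the question (the TF 1.x MNIST model above), such a steepest-ascent loop on a class score could look roughly like the sketch below. Here x, y_conv, keep_prob and sess are assumed to come from the question's model, img2 is a stand-in for the digit-2 array flattened to length 784, and the sign-of-gradient step of 0.1 is an arbitrary choice, not something prescribed by the answer.

import numpy as np
import tensorflow as tf

target = 6                                  # class we want the network to predict
class_score = y_conv[0, target]             # score of the target class for one image
grad = tf.gradients(class_score, x)[0]      # d(class score) / d(input pixels)

adv = img2.reshape(1, 784).copy()           # start from the digit-2 image
for _ in range(10):                         # a few small nudges
    g = sess.run(grad, feed_dict={x: adv, keep_prob: 1.0})
    adv = np.clip(adv + 0.1 * np.sign(g), 0.0, 1.0)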
Important note

It's best not to forget that your method should also not make the image unrecognizable to a human, which would make the whole exercise pointless.
* I think that, for the purposes of this discussion, it is best to think of the image as a vector.