A Summary and Analysis of the MNIST Experiment in TensorFlow

Posted by phonard


Reading the Dataset

The local MNIST dataset is read with the input_data module of the TensorFlow framework, using one-hot encoding for the labels. Note: the code below is incomplete; only parts are shown to aid understanding.

from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("./mnist/", one_hot=True)
xs, ys = mnist.train.next_batch(BATCH_SIZE)  # BATCH_SIZE = 100

feed_dict = {
    input_x: xs,
    input_y: ys
}

_, step, y_pred_out, train_loss, train_acc = sess.run([train_op, global_step, y_pred, loss, accuracy], feed_dict=feed_dict)

print("xs shape:{}".format(np.shape(xs)))
print("ys shape:{}".format(np.shape(ys)))
print("ys:", ys)
print("y_pred shape:", y_pred_out.shape)
print("y_pred value:", y_pred_out)

One batch of output after one-hot encoding is shown below. The MNIST dataset has 10 label classes (the digits 0-9), and each training step uses 100 examples. As the output shows, the 60000×28×28 training images, fed through the mnist.train.next_batch(BATCH_SIZE) statement above, yield an input batch of shape (100, 784): 100 is the number of images, as set by BATCH_SIZE, and 784 is the result of reshaping each 28×28 image into a vector. ys holds the labels corresponding to xs, printed in one-hot form, i.e. the entry for the true class is 1 and all others are 0. The values of y_pred_out produced by the network are shown below as well; the loss value is obtained from them with the softmax cross-entropy statement followed by the averaging statement (tf.reduce_mean).
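One-hot encoding itself is easy to reproduce outside the framework. A minimal numpy sketch (the digit labels here are invented for illustration, not taken from the batch above):

import numpy as np

labels = np.array([4, 1, 8])   # three example digit labels
one_hot = np.eye(10)[labels]   # row i is all zeros except a 1 at index labels[i]
print(one_hot.shape)           # (3, 10)
print(one_hot[0])              # [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]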


These two statements first apply the softmax formula $\mathrm{softmax}(z_i) = \frac{e^{z_i}}{\sum_j e^{z_j}}$ to the ten predicted logits y_pred, turning each into a class probability, and then evaluate the cross-entropy formula $H(y, \hat{y}) = -\sum_i y_i \log \hat{y}_i$, i.e. compute the per-class products input_y * tf.log(softmax(y_pred)), sum them, and negate. Since the labels come from one-hot encoding and contain only 0s and a single 1, only the cross-entropy term belonging to the true class contributes.

For a detailed analysis see: https://www.jianshu.com/p/648d791b55b0

cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=y_pred, labels=input_y)
loss = tf.reduce_mean(cross_entropy)
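To make the computation concrete, the following numpy sketch (not from the original post; the logits are rounded values from the first row of y_pred below, whose label is class 4) reproduces what these two lines do for a single example:

import numpy as np

logits = np.array([-3.43, -7.41, -5.16, -7.81, 11.78, -3.71, 1.28, -0.31, -5.46, 0.83])
label = np.array([0., 0., 0., 0., 1., 0., 0., 0., 0., 0.])  # one-hot, true class is 4

# softmax: exponentiate and normalize (shifting by the max adds numerical stability)
probs = np.exp(logits - logits.max())
probs = probs / probs.sum()

# cross-entropy: -sum(y * log(y_hat)); only the true-class term is nonzero
cross_entropy = -np.sum(label * np.log(probs))
print(cross_entropy)  # close to 0, because the class-4 logit dominates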


  1 xs shape:(100, 784)
  2 ys shape:(100, 10)
  3 ys: [[0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
  4  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
  5  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
  6  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
  7  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
  8  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
  9  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 10  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 11  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 12  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 13  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
 14  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 15  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 16  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 17  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 18  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 19  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 20  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 21  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 22  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 23  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 24  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 25  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 26  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 27  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 28  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 29  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 30  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 31  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 32  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
 33  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 34  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 35  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 36  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
 37  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 38  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 39  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 40  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 41  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 42  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 43  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 44  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 45  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 46  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 47  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 48  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 49  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 50  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 51  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 52  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 53  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 54  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
 55  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 56  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 57  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 58  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 59  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 60  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 61  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 62  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 63  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 64  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 65  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 66  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 67  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 68  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 69  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 70  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 71  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 72  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 73  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 74  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
 75  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 76  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 77  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 78  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 79  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
 80  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 81  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 82  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 83  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 84  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 85  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 86  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 87  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
 88  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 89  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 90  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 91  [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 92  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
 93  [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]
 94  [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]
 95  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 96  [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
 97  [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
 98  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]
 99  [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]
100  [0. 0. 0. 0. 0. 0. 1. 0. 0. 0.]
101  [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]
102  [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]]
103 y_pred shape: (100, 10)
104 y_pred value: [[-3.43147159e+00 -7.40697002e+00 -5.15638876e+00 -7.81467533e+00
105    1.17764559e+01 -3.70678473e+00  1.27706754e+00 -3.12552333e-01
106   -5.46425915e+00  8.31084669e-01]
107  [-6.51548719e+00  9.20612431e+00 -2.78829408e+00 -2.45178580e+00
108   -2.85427904e+00 -2.58734512e+00 -1.88677633e+00 -1.97383308e+00
109   -4.37865198e-01 -3.41498566e+00]
110  [-3.16933393e+00 -2.19506049e+00 -3.30737209e+00 -2.21414733e+00
111   -1.07381320e+00  1.51002157e+00 -2.97042227e+00 -3.32907343e+00
112    4.93531418e+00 -1.37362480e+00]
113  [-1.22948694e+00  4.49715585e-01 -2.11010861e+00 -2.14119720e+00
114   -8.59284341e-01 -1.82794511e+00  3.47176266e+00 -2.92995453e+00
115   -1.63972914e-01 -3.31165552e+00]
116  [-6.15973949e+00  1.03094177e+01 -7.17151880e-01 -2.46371531e+00
117   -2.81105447e+00 -2.19281197e+00 -4.14533663e+00  1.39345062e+00
118   -1.22282195e+00 -6.90686989e+00]
119  [-2.58981854e-01  2.59471759e-02 -1.29580390e+00  2.10660958e+00
120   -2.72546625e+00  1.21887481e+00 -1.36797750e+00 -4.94224310e-01
121   -1.37599611e+00 -5.68572402e-01]
122  [-4.07663107e+00 -9.13997555e+00 -4.51832199e+00 -1.72402120e+00
123    1.10608399e+00 -2.44688630e+00 -8.17761898e+00  3.34035921e+00
124   -2.23232794e+00  6.32807684e+00]
125  [ 3.21722060e-01 -1.93337655e+00  2.03051829e+00  1.10554433e+00
126   -2.05161524e+00 -1.16088104e+00 -1.75546360e+00 -2.04713345e-01
127   -2.31031513e+00 -1.24849582e+00]
128  [-7.34535694e-01 -2.22563291e+00 -1.81636095e+00  3.51452160e+00
129   -3.95577526e+00  1.77771974e+00 -4.57862759e+00 -5.13235569e-01
130    5.18334270e-01  2.06159800e-02]
131  [-3.71348882e+00 -2.78583598e+00 -4.62314755e-01  6.95762444e+00
132   -3.37151146e+00 -3.04426342e-01 -7.25085163e+00  2.22509146e-01
133    1.45623326e+00  1.10429978e+00]
134  [-1.69454861e+00  3.51319313e-01  2.96807617e-01 -2.87734389e+00
135   -3.60054433e-01 -3.04964566e+00  3.53769755e+00 -9.99551415e-01
136   -1.46237707e+00 -4.84274101e+00]
137  [-7.16958523e+00  9.15426064e+00 -3.63996053e+00 -1.09802735e+00
138   -2.24934888e+00 -3.86852932e+00 -2.56921959e+00 -2.32778549e+00
139    3.13423127e-01 -2.92683959e+00]
140  [-4.35501432e+00 -6.64530516e+00 -2.67742610e+00 -1.17845066e-01
141   -2.85258937e+00 -5.50431848e-01 -9.60213375e+00  7.24538279e+00
142   -3.44789529e+00  1.64255810e+00]
143  [-5.47394323e+00 -1.08092821e+00 -1.64719379e+00  9.85328484e+00
144   -7.05593157e+00  5.75078607e-01 -9.68547440e+00 -9.15001869e-01
145    1.65658855e+00 -1.85792297e-01]
146  [-7.80359316e+00 -9.22026825e+00  8.33270922e-02  2.77046013e+00
147   -5.15884924e+00 -4.86662388e+00 -1.67992954e+01  1.29672270e+01
148   -6.25996876e+00  5.20924866e-01]
149  [-6.59617043e+00  9.96817589e+00 -2.75709653e+00 -1.71209896e+00
150   -4.01448631e+00 -3.02791524e+00 -3.12719727e+00 -1.51730132e+00
151   -4.98004258e-04 -4.12086439e+00]
152  [-9.65253115e-01 -8.49990654e+00 -1.43093002e+00 -5.73353577e+00
153    7.69400740e+00 -2.64650655e+00  1.64753631e-01 -1.14152443e+00
154   -5.05820453e-01  7.64154613e-01]
155  [-1.84365034e+00 -3.49704790e+00  3.73679233e+00 -6.40503824e-01
156    2.95650196e+00 -4.03611994e+00 -2.31397927e-01 -8.25035051e-02
157   -1.48663497e+00 -3.70997334e+00]
158  [-1.00288069e+00 -1.03853207e+01 -5.00546408e+00 -2.16474271e+00
159    1.26804781e+00  3.68169332e+00 -4.15595388e+00  7.74603367e-01
160   -3.46848106e+00  1.61924720e+00]
161  [-8.06704104e-01 -4.91363621e+00 -2.35596347e+00 -2.85702825e+00
162    4.42464352e-01  3.48482227e+00  8.58158052e-01 -3.67168498e+00
163    5.69935441e-01 -1.76623583e+00]
164  [-1.29407346e+00 -9.36934757e+00 -2.53483844e+00 -7.41593695e+00
165    9.14549923e+00 -2.84342217e+00  4.87867653e-01  3.02054346e-01
166   -2.69292545e+00  6.10078335e-01]
167  [ 8.34772587e-01 -5.35130882e+00 -3.89060688e+00 -9.13158119e-01
168   -8.66082966e-01  1.73187017e+00 -1.27346182e+00 -2.35996619e-02
169   -1.58519483e+00 -1.29326522e-01]
170  [ 7.60240889e+00 -9.71819305e+00 -1.23689961e+00 -5.86197948e+00
171   -9.22325909e-01 -1.94832075e+00 -1.45736015e+00 -2.39852977e+00
172    5.59649408e-01 -1.65333962e+00]
173  [-4.68253469e+00  1.01933968e+00  7.72633743e+00  1.19260812e+00
174   -4.88958168e+00 -3.84424973e+00 -5.00262451e+00  6.23769760e-01
175   -1.18713427e+00 -7.60716343e+00]
176  [-1.51126087e+00 -1.05078020e+01 -3.32696509e+00 -3.26949739e+00
177    3.91099620e+00 -3.28158092e+00 -5.26946545e+00  9.08722103e-01
178   -6.48690581e-01  4.97121668e+00]
179  [-2.81226707e+00 -4.03853327e-01 -5.44990301e-01  6.37227345e+00
180   -4.12642241e+00 -6.29671574e-01 -5.03091955e+00 -1.67015207e+00
181    1.68188739e+00 -2.35136199e+00]
182  [-3.13361979e+00 -7.25620556e+00  7.22328997e+00  1.19694144e-01
183    1.87362361e+00 -3.83806610e+00 -2.67449141e+00 -7.33701408e-01
184   -5.54657221e-01 -2.11470747e+00]
185  [ 3.21995211e+00 -6.80552101e+00  7.95095682e-01 -2.34603792e-01
186   -9.08745348e-01 -3.12709546e+00 -2.79886174e+00 -1.66139257e+00
187    2.00422907e+00 -5.08270144e-01]
188  [-2.43184519e+00  1.19397068e+00 -8.56360018e-01  2.03929019e+00
189   -4.22333002e+00 -3.37325621e+00 -5.87407732e+00 -6.91816330e-01
190    5.83471584e+00 -2.12539864e+00]
191  [ 6.21714163e+00 -8.79817486e+00 -8.40128541e-01 -4.24246120e+00
192    3.65379810e-01 -2.06815696e+00 -2.54196525e-02 -3.41335535e+00
193    1.95059562e+00 -2.01700282e+00]
194  [-2.62992001e+00 -5.09876299e+00 -2.89182496e+00 -4.59942293e+00
195    9.14648533e+00 -3.61462283e+00  1.81836653e+00 -2.78515220e-01
196   -5.07630110e+00 -1.15271986e-01]
197  [-6.08962917e+00  8.60184765e+00 -3.60675478e+00 -1.21665525e+00
198   -2.37063217e+00 -2.80497265e+00 -2.68576789e+00 -2.11186290e+00
199    2.57670701e-01 -2.32078004e+00]
200  [-5.56191397e+00 -5.79510736e+00 -1.49829817e+00  5.05793810e-01
201   -2.89646697e+00 -3.64661694e+00 -1.08193960e+01  7.57458639e+00
202   -2.58382630e+00  2.28986621e+00]
203  [-2.76466340e-01 -2.79876041e+00 -1.54234338e+00 -4.76632118e+00
204    2.31388807e-01 -2.95408177e+00  5.59752417e+00 -3.21146131e+00
205   -6.23485565e-01 -4.71544075e+00]
206  [-4.30801582e+00 -9.20159817e+00 -4.03485155e+00  1.37167680e+00
207   -4.68464279e+00  1.07908945e+01 -5.12878227e+00 -5.52002764e+00
208   -2.50791001e+00 -2.33863807e+00]
209  [-2.39886975e+00 -1.11668122e+00  1.04440320e+00  5.81965351e+00
210   -2.90512037e+00 -7.93900609e-01 -4.91021633e+00 -8.02788734e-01
211    1.49466944e+00 -2.38970304e+00]
212  [-3.66068935e+00  6.11943722e-01 -3.15550542e+00 -4.46218538e+00
213    6.49231672e-01  2.12191999e-01 -2.82125616e+00 -1.31334174e+00
214    3.62595296e+00 -1.24214435e+00]
215  [-1.06356752e+00 -3.22593236e+00  7.30952406e+00  1.13280401e-01
216   -2.88806701e+00 -4.83450747e+00 -7.94957936e-01 -3.03373861e+00
217   -1.44983602e+00 -7.83700228e+00]
218  [-1.21801233e+00 -1.22286808e+00 -6.40458405e-01 -1.76100385e+00
219   -1.26282722e-01 -2.63104749e+00 -2.86514342e-01 -3.10595155e+00
220    4.95346117e+00 -4.48569441e+00]
221  [ 6.40396357e+00 -9.44966125e+00 -2.49628043e+00 -5.05750942e+00
222   -2.42264867e+00 -2.32418835e-01 -4.14300632e+00 -2.40346551e-01
223   -2.64638513e-02  2.42475599e-01]
224  [-5.76569939e+00  9.09585285e+00 -9.45633531e-01 -1.75654376e+00
225   -3.83029675e+00 -1.85127723e+00 -3.55841303e+00  9.73912776e-02
226   -2.27342904e-01 -5.07458925e+00]
227  [ 8.98472023e+00 -7.60436106e+00 -1.53448558e+00 -3.44894695e+00
228   -5.27945709e+00 -2.82014871e+00 -3.96959114e+00  6.38426304e-01
229   -1.93303084e+00 -1.15806365e+00]
230  [-2.69994473e+00 -5.72256684e-01 -1.42745101e+00  6.57258081e+00
231   -5.15864801e+00  1.44560802e+00 -5.70862150e+00 -1.28633332e+00
232    2.32377797e-01 -8.15576375e-01]
233  [-3.37817955e+00 -5.84361696e+00 -1.81542134e+00 -2.66591477e+00
234   -7.23389208e-01  3.04080367e+00 -5.49892044e+00  2.14065886e+00
235   -1.18896052e-01  5.78194439e-01]
236  [-6.12238693e+00 -5.99363518e+00 -5.33898401e+00  4.07270342e-01
237    2.21266580e+00 -1.98279941e+00 -7.69091034e+00  1.28964198e+00
238   -7.47547925e-01  7.94568968e+00]
239  [-1.28930390e+00 -7.00750923e+00 -3.26391292e+00 -9.87108052e-03
240   -1.31265759e+00  4.69801998e+00 -2.00067902e+00 -3.28277397e+00
241   -1.44913840e+00 -9.75884378e-01]
242  [ 2.54299831e+00 -4.78630447e+00  7.40122652e+00  1.65497994e+00
243   -4.82590055e+00 -4.09781504e+00 -4.28494453e+00 -3.24872279e+00
244   -8.72146368e-01 -6.91599083e+00]
245  [-6.32149792e+00 -6.61988592e+00 -5.63645661e-01  1.11525857e+00
246   -3.80714297e+00 -4.31970072e+00 -1.28044872e+01  9.62807083e+00
247   -4.41576862e+00  1.44692111e+00]
248  [ 8.20231247e+00 -9.28899097e+00 -2.71525472e-01 -3.72444439e+00
249   -2.31766248e+00 -2.37256622e+00 -1.35713160e+00 -2.72670174e+00
250    1.24776840e+00 -2.33394265e+00]
251  [-3.61968517e+00 -3.87293339e+00  7.96407282e-01  1.36682999e+00
252   -2.06595993e+00 -8.92679930e-01 -6.15059233e+00  5.80121613e+00
253   -4.00507307e+00 -1.06616330e+00]
254  [-5.19153118e+00 -6.67478895e+00 -5.59129190e+00 -2.17675877e+00
255    3.83141351e+00 -3.01570654e+00 -6.29968071e+00  1.73336005e+00
256   -1.43138528e+00  6.95552635e+00]
257  [-8.00816476e-01 -1.97534251e+00 -6.86942518e-01 -5.75412893e+00
258   -2.99803704e-01 -4.79223967e+00  8.83904266e+00 -2.96786737e+00
259   -6.78934145e+00 -7.96787500e+00]
260  [-2.26696754e+00 -2.60733390e+00 -1.34535456e+00 -2.59781456e+00
261   -3.96868527e-01 -1.88764584e+00 -2.20465422e+00 -3.17909288e+00
262    5.62986422e+00 -1.81405020e+00]
263  [-3.20301652e+00 -7.57690907e+00 -5.85079527e+00 -1.21830857e+00
264   -7.79941559e-01  8.39675045e+00 -1.58110631e+00 -4.61066437e+00
265   -2.66063380e+00 -1.52737403e+00]
266  [-4.79705048e+00  8.73511791e-01  1.69124293e+00  8.01379323e-01
267   -2.55320597e+00 -2.70672560e+00 -5.14369011e+00  4.49944639e+00
268   -8.77303243e-01 -2.90336895e+00]
269  [ 2.53867924e-01 -3.65151167e+00 -7.13096976e-01 -5.01382470e-01
270   -2.92767048e+00  3.31925130e+00 -1.39039588e+00 -2.78348660e+00
271    8.91939402e-01 -1.76978469e+00]
272  [-7.18259430e+00 -8.58575058e+00 -4.46029997e+00  1.21175937e-01
273   -2.18285799e+00 -2.82996964e+00 -1.38716888e+01  1.09992228e+01
274   -5.36351538e+00  2.86079264e+00]
275  [-1.86776495e+00 -8.92533112e+00 -1.11221468e+00 -6.86615705e+00
276    1.09167690e+01 -4.38365650e+00  3.05499363e+00 -7.38142848e-01
277   -4.72669363e+00 -1.16075397e+00]
278  [ 7.34524965e+00 -9.96361828e+00 -1.53066730e-02 -5.12100267e+00
279   -6.84436619e-01 -2.31362653e+00 -1.05270493e+00 -2.99627304e+00
280    9.61754024e-01 -2.46398997e+00]
281  [-3.64094507e-04 -3.21890140e+00 -1.45037913e+00  3.69917727e+00
282   -5.63264036e+00 -1.37363422e+00 -8.25037384e+00 -1.34430754e+00
283    5.51047707e+00  1.25459409e+00]
284  [-4.29434061e+00 -8.33967876e+00 -5.31020737e+00  4.50353146e-01
285    4.67013180e-01  1.92504421e-01 -8.64326286e+00  1.07248914e+00
286   -8.62734765e-02  6.88620949e+00]
287  [-5.06261349e+00  7.82321692e+00 -6.57718122e-01 -1.36327624e+00
288   -3.70516539e+00 -1.51302814e+00 -3.45879149e+00 -3.81949872e-01
289    1.05202578e-01 -4.38645267e+00]
290  [-3.45011425e+00  8.38857651e-01  1.47478032e+00 -3.45242262e+00
291   -2.51280880e+00 -2.05242348e+00  3.14104295e+00 -5.59536040e-01
292   -1.59673190e+00 -6.32680464e+00]
293  [-5.52827215e+00 -8.06639862e+00 -5.80005264e+00 -3.11182380e+00
294    4.72993135e+00 -2.04249454e+00 -6.19051456e+00  5.06243818e-02
295   -5.18482208e-01  7.44314003e+00]
296  [-7.11267042e+00  9.89037037e+00 -3.90442824e+00 -1.00292110e+00
297   -2.00449967e+00 -2.99454379e+00 -4.04968596e+00 -9.73127127e-01
298   -5.28770506e-01 -1.91362500e+00]
299  [ 1.36838242e-01 -6.91906214e+00 -1.52485931e+00 -4.67869329e+00
300   -3.16668898e-01 -5.84052801e-02 -1.63880575e+00 -6.03019667e+00
301    7.51271248e+00 -2.93813157e+00]
302  [ 6.01794672e+00 -9.79784203e+00 -1.67218637e+00 -1.76300597e+00
303   -3.49650502e+00 -5.21655560e-01 -6.04593086e+00  1.54522693e+00
304   -1.11886954e+00  5.83219349e-01]
305  [ 8.88364220e+00 -1.20231810e+01 -2.24906945e+00 -7.88442278e+00
306    2.21949935e-01 -2.78231096e+00 -1.53858614e+00 -4.06539536e+00
307    3.09143496e+00 -1.79866123e+00]
308  [-3.05259728e+00 -8.78616524e+00 -4.56072950e+00 -7.65971327e+00
309    1.23649864e+01 -4.62330198e+00  1.28370130e+00 -7.11889982e-01
310   -5.19386148e+00  1.63865781e+00]
311  [-5.88566494e+00 -6.74396801e+00 -7.48694849e+00 -6.41776323e-01
312    1.50634933e+00 -2.27251101e+00 -9.75167179e+00  2.58352757e+00
313   -3.96830857e-01  7.76620960e+00]
314  [-7.95795488e+00  9.55914688e+00 -4.17732191e+00 -1.31407106e+00
315   -1.46861434e+00 -4.14468145e+00 -3.69251156e+00 -1.35154915e+00
316   -5.48488855e-01 -1.84668946e+00]
317  [-5.95875788e+00 -5.90538311e+00 -2.54885268e+00 -9.96709764e-01
318    2.33633089e+00 -3.87184530e-01 -5.84520197e+00  1.55007470e+00
319   -1.84657717e+00  6.04645586e+00]
320  [-3.72540623e-01 -1.63343704e+00  5.03509939e-01  8.87222111e-01
321   -2.41230798e+00  1.28710938e+00 -1.46715665e+00 -3.64571571e+00
322    7.13795841e-01 -4.14003611e+00]
323  [-2.68255830e+00 -5.89093447e+00 -1.52046919e+00 -6.83910668e-01
324   -2.12034464e-01  8.47202659e-01 -4.80842209e+00  3.45603395e+00
325   -2.58179569e+00  1.68733501e+00]
326  [-4.40986300e+00 -2.40915507e-01  1.06261420e+00  6.33849144e+00
327   -4.16257334e+00 -1.77047348e+00 -6.77388334e+00  4.15884435e-01
328    1.72482109e+00 -2.29680467e+00]
329  [ 6.23313236e+00 -6.86183405e+00 -3.41004461e-01 -3.78087258e+00
330   -4.70880866e-01 -1.76971292e+00  4.65679228e-01 -2.68214107e+00
331    3.39652374e-02 -1.37045527e+00]
332  [-8.34015250e-01 -2.97809124e+00 -2.99976140e-01 -2.12902164e+00
333    2.46314436e-01 -7.22056270e-01  3.29645920e+00 -3.44532990e+00
334   -5.79032660e-01 -3.74988294e+00]
335  [-7.35883808e+00  9.41337776e+00 -4.47569799e+00 -2.65217781e+00
336   -1.20536995e+00 -3.11079574e+00 -2.41238952e+00 -2.13306212e+00
337    3.26139867e-01 -2.16879630e+00]
338  [ 1.17565651e+01 -1.16222315e+01 -7.50759602e-01 -6.15390158e+00
339   -4.86827755e+00 -3.02515316e+00 -3.54235864e+00 -1.72917056e+00
340   -6.89720273e-01 -2.61394286e+00]
341  [-6.82061434e+00  9.43066311e+00 -2.70753312e+00 -8.03832173e-01
342   -2.31512642e+00 -3.14683795e+00 -3.29978728e+00 -9.64430213e-01
343   -1.48874402e+00 -2.80748677e+00]
344  [ 3.52417901e-02 -3.83693147e+00  4.88774252e+00 -1.06256068e+00
345   -6.73442960e-01 -5.25066090e+00 -2.76347256e+00  1.51926529e+00
346   -2.50236988e+00 -2.44734383e+00]
347  [ 1.90826917e+00 -1.02698355e+01 -4.90185785e+00 -7.13517952e+00
348   -3.66843176e+00  6.61018658e+00 -6.54808187e+00  9.06733453e-01
349   -1.33572221e+00 -1.58266997e+00]
350  [-1.59806705e+00  5.12693524e-01 -3.96222091e+00 -2.97372907e-01
351   -3.58801270e+00  1.59314811e-01 -5.75293827e+00 -1.18524766e+00
352    5.25265646e+00  6.76914811e-01]
353  [ 2.84939408e-01 -3.72666717e+00 -9.87474322e-02 -1.24348295e+00
354   -1.12289953e+00  1.33937263e+00 -9.63008463e-01 -2.69848800e+00
355    1.45611620e+00 -2.29970026e+00]
356  [-1.30435586e+00 -9.91753960e+00 -5.55130184e-01 -6.56160212e+00
357    1.10248003e+01 -4.83570242e+00  2.39743471e+00 -2.32640058e-02
358   -5.08653355e+00 -8.35475981e-01]
359  [-4.46291876e+00 -1.66366923e+00 -2.40100527e+00  7.54964924e+00
360   -5.16322184e+00  1.25664580e+00 -7.37406158e+00 -4.36030030e-01
361    3.22779417e-01  4.63901937e-01]
362  [-7.41311979e+00  9.68668175e+00 -3.57693028e+00 -1.79707193e+00
363   -1.85976410e+00 -3.57132149e+00 -2.99357486e+00 -1.45108092e+00
364   -7.03378797e-01 -2.52111888e+00]
365  [-6.92787075e+00  8.91381454e+00 -3.77061844e+00 -2.22201777e+00
366   -1.58098698e+00 -3.34978390e+00 -2.31528354e+00 -2.00407672e+00
367    6.41008496e-01 -2.42776370e+00]
368  [-5.63030958e+00 -7.16164303e+00 -3.13681930e-01  7.09101856e-01
369   -2.99321437e+00 -4.03654957e+00 -1.22700424e+01  9.65640163e+00
370   -4.13436127e+00  7.79625297e-01]
371  [ 7.55145884e+00 -7.69227695e+00 -9.64319646e-01 -4.88572264e+00
372   -2.62384057e+00 -1.84793746e+00 -1.76602757e+00 -1.45980740e+00
373   -9.08782482e-01 -1.59664059e+00]
374  [-4.86237955e+00  3.91856837e+00  7.20119333e+00  1.32231820e+00
375   -5.17088604e+00 -3.63074875e+00 -5.97644281e+00  9.61266398e-01
376   -2.71759129e+00 -7.76719332e+00]
377  [-7.98956394e-01 -2.17634606e+00 -1.45666933e+00 -1.13282490e+00
378   -2.84885812e+00 -2.52794671e+00 -2.08882689e+00 -4.04120159e+00
379    7.97954464e+00 -3.63711405e+00]
380  [ 1.24917674e+00 -7.19460821e+00 -1.83539522e+00 -4.43933058e+00
381    3.52958977e-01  1.52172661e+00 -3.59291291e+00  2.47452784e+00
382   -2.33980775e+00 -5.21280169e-02]
383  [-8.10598564e+00  1.02367697e+01 -3.86531997e+00 -1.51181316e+00
384   -1.73392177e+00 -3.41944504e+00 -3.69718075e+00 -8.79215956e-01
385   -1.10280275e+00 -2.31143069e+00]
386  [-5.76149035e+00 -3.08287168e+00 -1.89743757e+00  4.70044327e+00
387   -1.51103759e+00  2.14207101e+00 -6.41087103e+00  2.08434790e-01
388   -2.57694274e-01  2.96276641e+00]
389  [ 3.47166151e-01  5.10207117e-01  5.76314986e-01  5.71353197e-01
390   -3.06604767e+00 -2.15022206e-01  1.15897608e+00 -2.41230011e+00
391   -2.15586877e+00 -4.49470472e+00]
392  [-3.55950046e+00 -5.49704981e+00 -4.17998552e+00 -3.66918373e+00
393    7.20040655e+00 -1.93672049e+00 -2.00535870e+00  7.23009765e-01
394   -2.14680696e+00  2.85113454e+00]
395  [-8.14102411e-01 -5.79159379e-01 -1.23053275e-01 -3.33355784e+00
396   -1.48242378e+00 -3.69326711e+00  6.11372566e+00 -4.43101740e+00
397   -1.31825018e+00 -7.66329432e+00]
398  [-5.16966867e+00 -8.15670872e+00 -7.77287245e-01 -2.20390391e+00
399    4.42545176e+00 -8.57552528e-01 -3.65839958e+00  1.47850239e+00
400   -2.55112362e+00  5.15741873e+00]
401  [ 7.62500858e+00 -3.19782257e+00 -2.43828133e-01 -2.12472630e+00
402   -5.53111601e+00 -1.81393397e+00 -3.54419500e-01 -1.65171635e+00
403   -3.11976862e+00 -3.81125307e+00]]
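A sanity check on the two dumps above: in every row of y_pred the largest logit should sit at the index where the corresponding row of ys has its 1 (in the first row, for example, the largest logit 1.17764559e+01 is at index 4, matching the label). The accuracy node in the graph does exactly this with tf.argmax; the same check on the fetched numpy arrays looks like this:

import numpy as np

pred_classes = np.argmax(y_pred_out, axis=1)  # index of the largest logit per row, shape (100,)
true_classes = np.argmax(ys, axis=1)          # index of the 1 in each one-hot row, shape (100,)
print(np.mean(pred_classes == true_classes))  # fraction of correct predictions in this batch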


Once the loss value loss is obtained, it is minimized with the Adam optimizer; the relevant code is as follows:

global_step = tf.Variable(0, trainable=False)
train_op = tf.train.AdamOptimizer(LEARNING_RATE).minimize(loss, global_step=global_step)
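minimize() is shorthand for computing the gradients and then applying them; passing global_step makes the optimizer increment that counter by one on every update, which is where the step value printed during training comes from. A sketch of the equivalent two-step form (standard TF1 optimizer API, not code from the original post):

optimizer = tf.train.AdamOptimizer(LEARNING_RATE)
grads_and_vars = optimizer.compute_gradients(loss)  # list of (gradient, variable) pairs
train_op = optimizer.apply_gradients(grads_and_vars, global_step=global_step)  # also increments global_step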


Finally, the complete code:

import tensorflow as tf
import numpy as np
from tensorflow.examples.tutorials.mnist import input_data
from tensorflow.python import pywrap_tensorflow

# set the parameters
BATCH_SIZE = 100
INPUT_NODE = 784
OUTPUT_NODE = 10
LAYER1_NODE = 50
LAYER2_NODE = 100
TRAIN_STEP = 10000
LEARNING_RATE = 0.01

# define input placeholders
input_x = tf.placeholder(tf.float32, [None, INPUT_NODE], name="input_x")
input_y = tf.placeholder(tf.float32, [None, OUTPUT_NODE], name="input_y")

# define weights and biases
Layer1_w = tf.Variable(tf.truncated_normal(shape=[INPUT_NODE, LAYER1_NODE], stddev=0.1))
Layer1_b = tf.Variable(tf.constant(0.1, shape=[LAYER1_NODE]))

Layer2_w = tf.Variable(tf.truncated_normal(shape=[LAYER1_NODE, LAYER2_NODE], stddev=0.1))
Layer2_b = tf.Variable(tf.constant(0.1, shape=[LAYER2_NODE]))

Layer3_w = tf.Variable(tf.truncated_normal(shape=[LAYER2_NODE, OUTPUT_NODE], stddev=0.1))
Layer3_b = tf.Variable(tf.constant(0.1, shape=[OUTPUT_NODE]))


def Network(x, w1, b1, w2, b2, w3, b3):
    # two fully connected ReLU layers; the output layer returns raw logits
    layer1 = tf.nn.relu(tf.nn.xw_plus_b(x, w1, b1))
    layer2 = tf.nn.relu(tf.nn.xw_plus_b(layer1, w2, b2))
    pred = tf.nn.xw_plus_b(layer2, w3, b3)
    print("Network is ready!")
    return pred


y_pred = Network(input_x, Layer1_w, Layer1_b, Layer2_w, Layer2_b, Layer3_w, Layer3_b)

# define loss
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=y_pred, labels=input_y)  # cross-entropy loss
loss = tf.reduce_mean(cross_entropy)
print("Loss is ready!")

# define accuracy
correct_predictions = tf.equal(tf.argmax(y_pred, 1), tf.argmax(input_y, 1))  # True where prediction matches label, else False
accuracy = tf.reduce_mean(tf.cast(correct_predictions, tf.float32))  # cast bool to float32 and take the mean
print("Accuracy is ready!")

# train operation
global_step = tf.Variable(0, trainable=False)
train_op = tf.train.AdamOptimizer(LEARNING_RATE).minimize(loss, global_step=global_step)
print("Train operation is ready!")

# define the model saver
saver = tf.train.Saver()


def train(mnist):
    with tf.Session() as sess:
        # $ tensorboard --logdir=logs
        # http://0.0.0.0:6006/
        # tf.train.SummaryWriter will soon be deprecated, use the following
        tf.summary.FileWriter("logs/", sess.graph)

        tf.global_variables_initializer().run()
        print("Initialization is ready!")

        for i in range(TRAIN_STEP):
            xs, ys = mnist.train.next_batch(BATCH_SIZE)
            # print("xs shape:{}".format(np.shape(xs)))
            # print("ys shape:{}".format(np.shape(ys)))
            # print("ys:", ys)

            feed_dict = {
                input_x: xs,
                input_y: ys
            }

            _, step, train_loss, train_acc = sess.run([train_op, global_step, loss, accuracy], feed_dict=feed_dict)
            if i % 100 == 0:
                print("After %d steps, in train data, loss is %g, accuracy is %g." % (step, train_loss, train_acc))
            # print("y_pred shape:", y_pred_out.shape)
            # print("y_pred value:", y_pred_out)

        test_feed = {input_x: mnist.test.images, input_y: mnist.test.labels}
        test_acc = sess.run(accuracy, feed_dict=test_feed)
        print("After %d steps, in test data, accuracy is %g." % (TRAIN_STEP, test_acc))
        saver.save(sess, "./model/my_model.ckpt-" + str(i + 1))  # save the trained model


if __name__ == "__main__":
    mnist = input_data.read_data_sets("./mnist/", one_hot=True)  # read mnist from the current path
    print("MNIST is ready!")
    train(mnist)

    # print the weights and biases stored in the saved model
    # model_reader = pywrap_tensorflow.NewCheckpointReader('./model/my_model.ckpt-10000')
    # var_dict = model_reader.get_variable_to_shape_map()
    # for key in var_dict:
    #     print("variable name: ", key)
    #     print(model_reader.get_tensor(key))

