tensorflow-chp04

Posted by rongyongfeikai2


Editor's note: this article, compiled by the cha138.com editors, introduces the material for tensorflow-chp04; hopefully it serves as a useful reference.

#coding:utf-8
import tensorflow as tf

if __name__ == '__main__':
    # Load MNIST: x is (60000, 28, 28) uint8 images, y is (60000,) integer labels
    (x, y), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    # Scale pixel values to [0, 1] and cast labels to int32
    x = tf.convert_to_tensor(x, dtype=tf.float32) / 255.
    y = tf.convert_to_tensor(y, dtype=tf.int32)
    # One-hot encode the labels for the MSE loss used below
    y_onehot = tf.one_hot(y, depth=10)
    # Flatten each 28x28 image into a 784-dimensional vector
    x = tf.reshape(x, [-1, 28*28])
    lr = 0.001

    # Network parameters: 784 -> 256 -> 128 -> 10, truncated-normal init for the weights
    w1 = tf.Variable(tf.random.truncated_normal([784, 256], stddev=0.1))
    b1 = tf.Variable(tf.zeros([256]))
    w2 = tf.Variable(tf.random.truncated_normal([256, 128], stddev=0.1))
    b2 = tf.Variable(tf.zeros([128]))
    w3 = tf.Variable(tf.random.truncated_normal([128, 10], stddev=0.1))
    b3 = tf.Variable(tf.zeros([10]))

    for epoch in range(500):
        with tf.GradientTape() as tape:
            # Forward pass: two hidden layers with ReLU, linear output layer
            h1 = x @ w1 + b1
            h1 = tf.nn.relu(h1)
            h2 = h1 @ w2 + b2
            h2 = tf.nn.relu(h2)
            out = h2 @ w3 + b3

            # Mean-squared-error loss between one-hot labels and raw outputs
            loss = tf.square(y_onehot - out)
            loss = tf.reduce_mean(loss)

        print("epoch: " + str(epoch) + " loss: " + str(loss.numpy()))

        # Take gradients outside the tape context, then apply a plain SGD step
        grads = tape.gradient(loss, [w1, b1, w2, b2, w3, b3])
        w1.assign_sub(lr * grads[0])
        b1.assign_sub(lr * grads[1])
        w2.assign_sub(lr * grads[2])
        b2.assign_sub(lr * grads[3])
        w3.assign_sub(lr * grads[4])
        b3.assign_sub(lr * grads[5])
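
The listing loads x_test and y_test but never uses them. A minimal evaluation sketch, reusing the variable names from the code above and meant to be appended at the end of the main block after the training loop (this is an illustrative addition, not part of the original chapter code):

    # Sketch: measure accuracy on the held-out test split, assuming w1..b3,
    # x_test and y_test from the listing above are still in scope
    x_test = tf.convert_to_tensor(x_test, dtype=tf.float32) / 255.
    x_test = tf.reshape(x_test, [-1, 28*28])
    y_test = tf.convert_to_tensor(y_test, dtype=tf.int32)
    h1 = tf.nn.relu(x_test @ w1 + b1)
    h2 = tf.nn.relu(h1 @ w2 + b2)
    out = h2 @ w3 + b3
    pred = tf.cast(tf.argmax(out, axis=1), dtype=tf.int32)
    correct = tf.cast(tf.equal(pred, y_test), dtype=tf.float32)
    print("test accuracy: " + str(tf.reduce_mean(correct).numpy()))

tf.argmax over the raw outputs is enough to pick the predicted class; no softmax is needed for prediction, since it does not change the ordering of the outputs.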

The above is the main content on tensorflow-chp04. If it did not solve your problem, please refer to the following articles:

tensorflow-chp06

tensorflow -----AttributeError: module ‘tensorflo

Windows下Pycharm安装Tensorflow:ERROR: Could not find a version that satisfies the requirement tensorflo

Saving a "fine-tuned" BERT model

Installing TensorFlow on Ubuntu (a translation of the official docs)

Using TensorBoard