4. Keras: introduction and application of cross-entropy
1. Loading and preprocessing the data
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import *
from keras.optimizers import SGD
import os
import tensorflow as tf

# Load the data
(x_train,y_train),(x_test,y_test) = mnist.load_data()
# Preprocessing
# Reshape (60000,28,28) to (60000,784) and scale to [0,1] so it can feed the dense layer
x_train = x_train.reshape(x_train.shape[0],-1)/255.0
x_test = x_test.reshape(x_test.shape[0],-1)/255.0
# Convert the labels to one-hot encoding
y_train = np_utils.to_categorical(y_train,num_classes=10)
y_test = np_utils.to_categorical(y_test,num_classes=10)
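As a quick illustration (a hypothetical mini-example, not part of the original script), one-hot encoding turns each integer label into a length-10 vector with a single 1 at the label's index, which is exactly what np_utils.to_categorical produces:

```python
import numpy as np

# Three example labels, assumed for illustration
labels = np.array([3, 0, 7])
# Building one-hot rows by indexing into an identity matrix
one_hot = np.eye(10)[labels]   # shape (3, 10)
print(one_hot[0])              # 1.0 at index 3, 0.0 elsewhere
```

Each row sums to 1, so the labels can be compared directly against the softmax probabilities the network outputs.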
2. Building the network and printing the training results
# Build the network
model = Sequential([
    # 784 inputs, 10 outputs
    Dense(units=10,input_dim=784,bias_initializer='one',activation='softmax')
])
# Compile
# Custom optimizer
sgd = SGD(lr=0.1)
model.compile(optimizer=sgd,
              # Use cross-entropy as the loss
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train,y_train,batch_size=32,epochs=10,validation_split=0.2)
# Evaluate the model
loss,acc = model.evaluate(x_test,y_test)
print('\ntest loss',loss)
print('test acc',acc)
out:
Epoch 1/10
32/48000 [..............................] - ETA: 2:43 - loss: 2.2593 - acc: 0.1562
1792/48000 [>.............................] - ETA: 4s - loss: 1.2642 - acc: 0.6579
......
......
Epoch 10/10
47456/48000 [============================>.] - ETA: 0s - loss: 0.2712 - acc: 0.9241
48000/48000 [==============================] - 2s 41us/step - loss: 0.2716 - acc: 0.9240 - val_loss: 0.2748 - val_acc: 0.9240
32/10000 [..............................] - ETA: 0s
2976/10000 [=======>......................] - ETA: 0s
6656/10000 [==================>...........] - ETA: 0s
10000/10000 [==============================] - 0s 17us/step
test loss 0.2802182431191206
test acc 0.9205
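To see what categorical_crossentropy actually computes, here is a minimal sketch (values assumed for illustration, not from the run above) of the loss for a single sample: loss = -sum(y_true * log(y_pred)), which for a one-hot label reduces to the negative log of the probability the softmax assigned to the true class:

```python
import numpy as np

# One-hot label: the true class is index 2
y_true = np.array([0.0, 0.0, 1.0, 0.0])
# Softmax output of the network for this sample (assumed values)
y_pred = np.array([0.1, 0.1, 0.7, 0.1])

# Categorical cross-entropy for one sample
loss = -np.sum(y_true * np.log(y_pred))
print(round(loss, 4))   # -log(0.7) ≈ 0.3567
```

The loss grows quickly as the predicted probability of the true class shrinks, which is why cross-entropy trains faster than quadratic loss when the softmax output is badly wrong.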