Using TensorBoard to monitor training in real time and visualize the model architecture
Posted: 2020-01-26 15:19:49

Question: I am learning to use TensorBoard with TensorFlow 2.0.
In particular, I want to monitor the learning curves in real time, and to visually inspect and communicate the architecture of my model.
Below I provide the code for a reproducible example.
I have three questions:

1. Although I get the learning curves once training has finished, I don't know what I should do to monitor them in real time.
2. The learning curves I get from TensorBoard do not match the plot from history.history. In fact, the reversal is strange and hard to interpret.
3. I cannot make sense of the graph. I trained a sequential model with five dense layers and dropout layers in between; what TensorBoard shows me contains many more elements than that.
My code is as follows (imports added so the example actually runs; the triplicated val_loss plot line has been deduplicated):
import matplotlib.pyplot as plt
from datetime import datetime

from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape=(train_data.shape[1],))
x1 = Dense(100, kernel_initializer='he_normal', activation='elu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer='he_normal', activation='elu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer='he_normal', activation='elu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer='he_normal', activation='elu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer='he_normal', activation='elu')(x4a)
predictions = Dense(1)(x5)

model = Model(inputs=inputs, outputs=predictions)
model.compile(optimizer='Adam', loss='mse')

logdir = "logs\\fit\\" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)

history = model.fit(train_data, train_targets,
                    batch_size=32,
                    epochs=20,
                    validation_data=(test_data, test_targets),
                    shuffle=True,
                    callbacks=[tensorboard_callback])

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
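A side note beyond the original question: the hard-coded `\\` separator in the log path above is Windows-specific. A minimal, portable sketch using only the standard library (the `make_logdir` helper name is my own, not part of any API):

```python
import os
from datetime import datetime

def make_logdir(root="logs", run="fit"):
    # Build a unique, OS-portable run directory such as logs/fit/20200126-151949
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    return os.path.join(root, run, stamp)

logdir = make_logdir()
print(logdir)
```

Each training run then gets its own timestamped subdirectory, which is what lets TensorBoard show runs side by side.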
【参考方案1】:我认为您可以做的是在您的模型上调用 .fit()
之前启动 TensorBoard。如果您使用的是 IPython(Jupyter 或 Colab),并且已经安装了 TensorBoard,那么您可以通过以下方式修改代码;
import matplotlib.pyplot as plt
from datetime import datetime

from tensorflow import keras
from tensorflow.keras.datasets import boston_housing
from tensorflow.keras.layers import Dense, Dropout, Input
from tensorflow.keras.models import Model

(train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

inputs = Input(shape=(train_data.shape[1],))
x1 = Dense(100, kernel_initializer='he_normal', activation='relu')(inputs)
x1a = Dropout(0.5)(x1)
x2 = Dense(100, kernel_initializer='he_normal', activation='relu')(x1a)
x2a = Dropout(0.5)(x2)
x3 = Dense(100, kernel_initializer='he_normal', activation='relu')(x2a)
x3a = Dropout(0.5)(x3)
x4 = Dense(100, kernel_initializer='he_normal', activation='relu')(x3a)
x4a = Dropout(0.5)(x4)
x5 = Dense(100, kernel_initializer='he_normal', activation='relu')(x4a)
predictions = Dense(1)(x5)

model = Model(inputs=inputs, outputs=predictions)
model.compile(optimizer='Adam', loss='mse')

logdir = "logs\\fit\\" + datetime.now().strftime("%Y%m%d-%H%M%S")
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir)
In another cell, you can run:
# Magic func to use TensorBoard directly in IPython
%load_ext tensorboard
Then launch TensorBoard by running, in yet another cell:
# Launch TensorBoard on the log directory.
# This should open TensorBoard inline (or in your browser), though you may not see your metadata yet.
# Note the braces: {logdir} interpolates the Python variable into the magic command.
%tensorboard --logdir {logdir}
Finally, you can call .fit() on your model in another cell:
history = model.fit(train_data, train_targets,
                    batch_size=32,
                    epochs=20,
                    validation_data=(test_data, test_targets),
                    shuffle=True,
                    callbacks=[tensorboard_callback])

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
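One extra knob beyond the original answer: by default the Keras TensorBoard callback writes scalars only once per epoch, so the curves refresh slowly. The callback's `update_freq` argument controls this. A minimal sketch, assuming TensorFlow 2.x:

```python
from datetime import datetime

from tensorflow import keras

logdir = "logs\\fit\\" + datetime.now().strftime("%Y%m%d-%H%M%S")

# update_freq accepts 'epoch' (default), 'batch', or an integer batch count;
# smaller values give more frequent, closer-to-real-time scalar updates,
# at the cost of extra I/O during training.
tensorboard_callback = keras.callbacks.TensorBoard(log_dir=logdir, update_freq=100)
```

Pass this callback to `model.fit()` exactly as before.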
If you are not using IPython, you probably just need to launch TensorBoard during or before training your model in order to monitor it in real time.
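Outside a notebook, a sketch of launching TensorBoard from a terminal (assuming it was installed with pip, and that logs live under logs/fit as in the code above):

```shell
# Point TensorBoard at the log root; it picks up new event files as training writes them.
# Then open http://localhost:6006 in a browser and refresh to see updated curves.
tensorboard --logdir logs/fit
```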