Improving DNNs Hyperparameter tuning-Regularization and Optimization (week2) Regularization

Posted by douzujun


Regularization

Deep learning models have so much flexibility and capacity that overfitting can be a serious problem if the training dataset is not big enough. Such a model may do well on the training set, but the learned network does not generalize to new examples that it has never seen!

You will learn to: Use regularization in your deep learning models.
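For concreteness, here is a minimal, self-contained sketch of one common form of regularization: an L2 penalty added to a cross-entropy cost for a 3-layer network. The function name l2_regularized_cost and the inline cross-entropy are assumptions for this illustration only, not the assignment's graded implementation.

# Minimal sketch of an L2-regularized cost (illustrative assumption, not the graded solution)
import numpy as np

def l2_regularized_cost(A3, Y, parameters, lambd):
    """Cross-entropy cost plus an L2 penalty on the weight matrices W1, W2, W3."""
    m = Y.shape[1]  # number of examples
    W1, W2, W3 = parameters["W1"], parameters["W2"], parameters["W3"]
    # standard binary cross-entropy on the output activations A3
    cross_entropy = -np.mean(Y * np.log(A3) + (1 - Y) * np.log(1 - A3))
    # L2 penalty: (lambd / 2m) * sum of squared weights over all layers
    l2_penalty = (lambd / (2 * m)) * (np.sum(np.square(W1))
                                      + np.sum(np.square(W2))
                                      + np.sum(np.square(W3)))
    return cross_entropy + l2_penalty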

Let's first import the packages you are going to use.

# import packages
import numpy as np
import matplotlib.pyplot as plt
from reg_utils import sigmoid, relu, plot_decision_boundary, initialize_parameters, load_2D_dataset, predict_dec
from reg_utils import compute_cost, predict, forward_propagation, backward_propagation, update_parameters
import sklearn
import sklearn.datasets
import scipy.io
from testCases import *

%matplotlib inline
plt.rcParams['figure.figsize'] = (7.0, 4.0) # set default size of plots
plt.rcParams['image.interpolation'] = 'nearest'
plt.rcParams['image.cmap'] = 'gray'
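With the packages imported, the next step in the notebook is typically to load the 2D dataset; assuming load_2D_dataset from reg_utils returns the train/test splits, the usage would look like:

# load the 2D dataset provided with the assignment (assumed return signature)
train_X, train_Y, test_X, test_Y = load_2D_dataset()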
