A Pitfall in Andrew Ng's Deep Learning, Course 2 Week 2 Programming Assignment

Posted by Spurs



import numpy as np


def initialize_parameters(layer_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the dimensions of each layer in our network
    
    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    W1 -- weight matrix of shape (layer_dims[1], layer_dims[0])
                    b1 -- bias vector of shape (layer_dims[1], 1)
                    Wl -- weight matrix of shape (layer_dims[l], layer_dims[l-1])
                    bl -- bias vector of shape (layer_dims[l], 1)
                    
    Tips:
    - For example: the layer_dims for the "Planar Data classification model" would have been [2,2,1]. 
    This means W1's shape was (2,2), b1 was (2,1), W2 was (1,2) and b2 was (1,1). Now you have to generalize it!
    - In the for loop, use parameters['W' + str(l)] to access Wl, where l is the iterative integer.
    """
    
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # number of layers in the network

    for l in range(1, L):
        # He initialization: scale the random weights by sqrt(2 / fan_in).
        # <------- The pitfall is here: the original used 2, we changed it to 2.0
        # (under Python 2, the integer division 2 / layer_dims[l-1] evaluates to 0 and zeroes out the weights).
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l-1]) * np.sqrt(2.0 / layer_dims[l-1])
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))
        
        assert parameters['W' + str(l)].shape == (layer_dims[l], layer_dims[l-1])
        assert parameters['b' + str(l)].shape == (layer_dims[l], 1)
        
    return parameters
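
The pitfall only bites under Python 2, where / between two integers is floor division: 2 / layer_dims[l-1] becomes 0 as soon as the previous layer has more than two units, so np.sqrt(0) zeroes every weight. A minimal sketch of the difference plus a quick shape check follows; the [2, 4, 1] layer sizes are an illustrative choice, not taken from the assignment.

# Under Python 2, integer division is exactly what the comment above warns about:
#   2 / 4   == 0    -> np.sqrt(0) == 0, so every weight would start at zero
#   2.0 / 4 == 0.5  -> the intended He scaling factor sqrt(0.5)
# Under Python 3, / is always true division, so 2 and 2.0 behave the same.

parameters = initialize_parameters([2, 4, 1])  # illustrative layer sizes, not from the assignment
for name in sorted(parameters):
    print(name, parameters[name].shape)
# Expected output: W1 (4, 2), W2 (1, 4), b1 (4, 1), b2 (1, 1)

Writing 2.0 (or 2.) makes the He scaling factor a float regardless of the interpreter version, which is why the fix is version-agnostic.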
