TensorFlow neural network only predicts 1 as the result

Posted: 2018-06-28 22:12:18

[Question]:

For some reason, after training on the dataset, the following code predicts 1 for every input. Please help. I have already tried lowering the learning rate, training in batches (even one example at a time), not initializing the weights to zero, using reduce_mean, and so on.

import math
import tensorflow as tf

xtf = tf.placeholder(tf.float32, [None, 7])
w = tf.Variable(tf.truncated_normal([7, 1], stddev=1. / math.sqrt(7)))
b = tf.Variable(tf.zeros([1]))
ytf = tf.nn.softmax(tf.matmul(xtf, w) + b)
y_ = tf.placeholder(tf.float32, [None, 1])
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(ytf), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.001).minimize(cross_entropy)

sess = tf.InteractiveSession()
tf.global_variables_initializer().run()

ptr = 0
batch = 1
x, y, xt = data()  # data() returns the training features, labels and test features listed below
while (ptr + 100) <= (len(x) - 1):
    x_batch, y_batch = x[ptr:ptr + 100], y[ptr:ptr + 100]
    sess.run(train_step, feed_dict={xtf: x_batch, y_: y_batch})
    ptr += 100
predict = sess.run(ytf, feed_dict={xtf: xt})

XTF = [[0.2711736617240513,0.014151057562208049,0.0,0.125,1.0,1.0,0.0],[0.4722292033174164,0.13913573538264068,0.0,0.125,0.0,0.0,1.0],[0.32143754712239253,0.015468569817999833,0.0,0.0,1.0,1.0 ,1.0],[0.43453128926866047,0.10364429745562033,0.0,0.125,1.0,0.0,1.0],[0.43453128926866047,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.01650950209357577,0.0,0.0,0.5,1.0,0.0 ],[0.6732847449107816,0.10122885832000206,0.0,0.0,1.0,0.0,0.0],[0.01985423473234481,0.04113566043083236,0.16666666666666666,0.375,1.0,1.0,0.0],[0.33400351847197784,0.021730754366528396,0.3333333333333333,0.0,1.0,1.0,1.0], [0.17064589092736868,0.058694292654020104,0.0,0.125,0.0,0.5,1.0],[0.04498617743151546,0.03259622914329302,0.16666666666666666,0.125,1.0,1.0,1.0],[0.7235486303091229,0.051822148727810165,0.0,0.0,1.0,0.0,1.0],[0.24604171902488062 , 0.015712553569072387, 0.0, 0.0, 1.0, 1.0, 0.0], [0.4847951746670017, 0.061044734 51835265,0.8333333333333334,0.125,1.0,1.0,0.0],[0.17064589092736868,0.015330377421392339,0.0,0.0,1.0,1.0,1.0],[0.6858507162603669,0.03122992013728673,0.0,0.0,1.0,0.5,1.0],[0.01985423473234481,0.056848213999904744, 0.16666666666666666,0.5%,0.5%,1.0,0.0],[0.3465694898215632,0.025374310111545468,0.0,0.0,1.0,0.5,0.0],[0.38426740387031916,0.03513366015444757,0.0,0.125,1.0,1.0,1.0],[0.3465694898215632,0.014102260811993537,0.0, 0.0,0.0,1.0,1.0],[0.43453128926866047,0.050748620223090936,0.0,0.0,1.0,0.5,0.0],[0.42196531791907516,0.025374310111545468,0.0,0.0,1.0,0.5,0.0],[0.18321186227695402,0.015671954672893913,0.0,0.0, 0.5,1.0,1.0],[0.3465694898215632,0.06929138530460492,0.0,0.0,1.0,0.0,0.0],[0.09525006282985675,0.04113566043083236,0.16666666666666666,0.375,1.0,1.0,1.0],[0.4722292033174164,0.061264319894317944,0.8333333333333334,0.125,1.0, 1.0, 1.0], [0.3465694898215632, 0.014102260811993537, 0.0, 0.0, 0.0, 1.0, 0.0],[0.23347574767529528,0.5133418122566505,0.3333333333333333,0.375,1.0,0.0,0.0],[0.3465694898215632,0.01537917417160685,0.0,0.0,0.5,1.0,1.0],[0.3465694898215632,0.015411575213749284,0.0,0.0,1.0,1.0,0.0] [0.4973611460165871,0.05410739813385612,0.0,0.0,0.0,0.0,0.0],[0.3465694898215632,0.2859895551532101,0.0,0.125,0.0,0.0,1.0],[0.3465694898215632,0.015126992566498259,0.0,0.0,0.5,1.0,1.0],[ 0.8240764011058055,0.020494635090094415,0.0,0.0,1.0,0.5,0.0],[0.3465694898215632,0.16038672010106,0.0,0.125,0.0,0.0,0.0],[0.5224930887157577,0.10149724044618187,0.0,0.125,1.0,0.0,0.0],[0.3465694898215632, 0.014110458666029575,0.0,0.0,0.0,1.0,0.0],[0.2586076903744659,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.22090977632570996,0.03513366015444757,0.0,0.25,1.0,1.0,1.0],[0.17064589092736868,0.021942337075458514, 0.0, 0.125, 0.0, 1.0, 1.0], [0.4973611460165871, 0.018493968331299484, 0.0, 0.125, 1.0, 1.0, 1.0], [0.334003 51847197784,0.04098927018018883,0.0,0.125,1.0,0.5,1.0],[0.3465694898215632,0.015411575213749284,0.0,0.0,0.0,1.0,0.0],[0.032420206081930136,0.08115719346076702,0.3333333333333333,0.125,0.0,0.5,1.0],[0.23347574767529528, 0.01537917417160685,0.0,0.0,0.5,1.0,1.0],[0.3465694898215632,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.030253985132996517,0.0,0.125,0.5,1.0,0.0],[0.3465694898215632,0.015126992566498259, 0.0,0.0,0.5,1.0,1.0],[0.3465694898215632,0.04231498029001666,0.0,0.25,0.0,1.0,0.0],[0.22090977632570996,0.034743286152731485,0.0,0.125,1.0,1.0,1.0],[0.08268409148027142,0.07746484096553544,0.16666666666666666, 
0.5,1.0,1.0,0.0],[0.2586076903744659,0.01522458606692728,0.0,0.0,1.0,1.0,0.0],[0.610454888162855,0.1497654242623688,0.0,0.125,0.0,0.0,1.0],[0.35913546117114853,0.050748620223090936,0.0,0.125, 1.0, 0.5, 1.0], [0.8115104297562201, 0.1209753416358076, 0.16666666666666666, 0.0, 0.0 ,0.0,0.0],[0.3465694898215632,0.06929138530460492,0.0,0.0,1.0,0.0,0.0],[0.2586076903744659,0.020494635090094415,0.0,0.0,1.0,0.5,1.0],[0.35285247549635584,0.014110458666029575,0.0,0.0,0.0,1.0 ,0.0],[0.05755214878110078,0.05416439273810667,0.3333333333333333,0.125,1.0,0.5,1.0],[0.13294797687861273,0.09154270340242172,0.3333333333333333,0.625,1.0,1.0,0.0],[0.2711736617240513,0.014110458666029575,0.0,0.0,0.0,1.0,0.0 ],[0.4722292033174164,0.15614960068643363,0.0,0.0,1.0,0.0,1.0],[0.5601910027645137,0.16293234896625058,0.0,0.125,1.0,0.0,0.0],[0.04498617743151546,0.05445717323939373,0.3333333333333333,0.375,1.0,1.0,0.0], [0.3465694898215632,0.05410739813385612,0.0,0.0,0.0,0.0,0.0],[0.3465694898215632,0.029757819776815374,0.16666666666666666,0.125,0.0,1.0,0.0],[0.35913546117114853,0.020494635090094415,0.0,0.0,1.0,0.5,1.0],[0.23347574767529528 , 0.015923941091001644, 0.0, 0.0, 1.0, 1.0, 0.0], [0.20834380497612465 ,0.015468569817999833,0.3333333333333333,0.5,1.0,1.0,1.0],[0.32143754712239253,0.016908073949327893,0.0,0.25,1.0,1.0,0.0],[0.39683337521990447,0.020494635090094415,0.0,0.0,1.0,0.5,0.0],[0.19577783362653933,0.09154270340242172 ,0.3333333333333333,0.625,1.0,1.0,1.0],[0.2586076903744659,0.1434624456306609,0.0,0.0,1.0,0.5,0.0],[0.32143754712239253,0.028212719478023115,0.0,0.125,0.0,1.0,0.0],[0.39683337521990447,0.11027245763075773,0.0 ,0.0,1.0,1.0,0.0],[0.3088715757728072,0.014931805565640218,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.015411575213749284,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.015712553569072387,0.0,0.0 ,1.0,1.0,0.0],[0.005152048253329982,0.056604230248832196,0.3333333333333333,0.0,1.0,0.5,0.0],[0.37170143252073384,0.024349578357040744,0.0,0.0,1.0,1.0,1.0],[0.2711736617240513,0.017566830077223785,0.0,0.0,1.0 , 1.0, 0.0], [0.35913546117114853, 0.018542765081513996, 0.0, 0.0, 1.0, 1.0, 0.0] [0.3465694898215632,0.015200187691820024,0.0,0.0,0.5,1.0,1.0],[0.3465694898215632,0.09193307740413781,0.0,0.0,1.0,0.0,0.0],[0.20834380497612465,0.020494635090094415,0.0,0.0,1.0,0.5,1.0],[ 0.4093993465694898,0.030937139635999665,0.0,0.375,1.0,1.0,1.0],[0.19577783362653933,0.06709553154495196,0.5%,0.125,1.0,1.0,0.0],[0.3465694898215632,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.2837396330736366, 0.5133418122566505,0.3333333333333333,0.375,1.0,0.0,1.0],[0.2963056044232219,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.35913546117114853,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.24604171902488062,0.015330377421392339, 0.0,0.0,1.0,1.0,0.0],[0.572756974114099,0.11940564777490723,0.0,0.125,1.0,0.0,0.0],[0.32143754712239253,0.04015972542654215,0.3333333333333333,0.125,1.0,1.0,0.0],[0.7361146016587082,0.014151057562208049,0.0, 0.0, 1.0, 1.0, 0.0], [0.3465694898215632, 0.015712553569072387, 0.0, 0.0, 1.0, 1.0, 0.0],[0.8869062578537321,0.06764049365134761,0.0,0.0,0.0,0.0,0.0],[0.2837396330736366,0.12366716556464086,0.16666666666666666,0.0,0.0,0.0,0.0],[0.42196531791907516,0.04489301019734967,0.16666666666666666,0.0,1.0,0.5,1.0] ,[0.42196531791907516,0.050748620223090936,0.0,0.0,0.125,1.0,0.5,0.0],[0.015411575215632,0.015411575215632,0.015411575213749284,0.0.0.0,0.0,0.0,0.0,0.0,10.0,1.0,10],[0.34656948915632,0.015411575215632,0.015411575213749284,0.0,0.0,1.0,1.0,0.0],[ 
0.2586076903744659,0.15085515328815924,0.16666666666666666,0.0,1.0,0.0,0.0],[0.4093993465694898,0.016891873428256675,0.0,0.0,1.0,1.0,0.0],[0.4596632319678311,0.015468569817999833,0.0,0.25,1.0,1.0,0.0],[0.3465694898215632, 0.015411575213749284,0.0,0.0,1.0,1.0,0.0],[0.2586076903744659,0.014931805565640218,0.0,0.0,1.0,1.0,1.0],[0.3465694898215632,0.015175789316712771,0.0,0.0,1.0,1.0,0.0],[0.4722292033174164,0.015411575213749284, 0.0, 0.0, 1.0, 1.0, 0.0], [0.3465694898215632, 0.04713766070721715, 0.0, 0 0.125,0.5,1.0,1.0],[0.5853229454636844,0.10149724044618187,0.0,0.0,1.0,0.0,0.0],[0.17692887660216136,0.028212719478023115,0.0,0.125,0.0,1.0,1.0],[0.2711736617240513,0.015712553569072387,0.0,0.0 ,1.0,1.0,0.0],[0.24604171902488062,0.019177122834302632,0.0,0.125,1.0,1.0,1.0],[0.20834380497612465,0.028220722145058292,0.0,0.0,0.0,1.0,1.0],[0.2586076903744659,0.015468569817999833,0.0,0.0,1.0 ,1.0,0.0],[0.8806232721789394,0.015126992566498259,0.0,0.0,0.5,1.0,0.0],[0.35913546117114853,0.04098927018018883,0.0,0.125,1.0,0.5,0.0],[0.2963056044232219,0.4831284260198326,0.16666666666666666,0.0,0.0,0.0 ,0.0],[0.01985423473234481,0.06104473451835265,0.3333333333333333,0.5,1.0,1.0,1.0],[0.2586076903744659,0.1434624456306609,0.0,0.25,1.0,0.5,0.0],[0.3465694898215632,0.015712553569072387,0.0,0.0,1.0,1.0,0.0 ], [0.40311636089469716, 0.058694292654020104, 0.0, 0.125, 0.0, 0.5, 0.0], [0.40311636089469716, 0.025374310111545468 ,0.0,0.0,1.0,0.5,1.0],[0.6732847449107816,0.15085515328815924,0.16666666666666666,0.0,1.0,0.0,0.0],[0.14551394822819805,0.021942337075458514,0.0,0.125,0.0,1.0,0.0],[0.3465694898215632,0.015126992566498259,0.0 ,0.0,0.5,1.0,0.0],[0.2963056044232219,0.01393967004027879,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.04364049521284361,0.16666666666666666,0.125,0.0,1.0,1.0],[0.5601910027645137,0.013614293309848433,0.0,0.0 ,1.0,1.0,0.0],[0.4093993465694898,0.015411575213749284,0.0,0.0,0.0,1.0,0.0],[0.24604171902488062,0.013760683560491965,0.0,0.0,1.0,1.0,0.0],[0.5853229454636844,0.028302115124416098,0.0,0.125,1.0 ,1.0,1.0],[0.35913546117114853,0.050748620223090936,0.0,0.125,1.0,0.5,1.0],[0.3088715757728072,0.025374310111545468,0.0,0.0,1.0,0.5,0.0],[0.2837396330736366,0.02936744577509929,0.0,0.0,0.0,0.5 , 0.0], [0.23347574767529528, 0.051301584996521765, 0.3333333333333333, 0.0, 1.0, 0.0, 1.0], [0.4596632319678311, 0.10364429745562033,0.0,0.125,1.0,0.0,0.0],[0.19577783362653933,0.01798980030808316,0.0,0.0,1.0,1.0,0.0],[0.2963056044232219,0.15458810467956932,0.0,0.0,0.0,0.0,0.0],[0.3465694898215632,0.029757819776815374, 0.3333333333333333,0.0,0.0,1.0,1.0],[0.2711736617240513,0.015126992566498259,0.0,0.0,1.0,1.0,1.0],[0.2963056044232219,0.030937139635999665,0.0,0.125,1.0,1.0,1.0],[0.23347574767529528,0.013175122557917838,0.0, 0.0,0.5,1.0,0.0],[0.22090977632570996,0.022446505098674834,0.0,0.0,1.0,0.5,0.0],[0.23347574767529528,0.07173122281533045,0.16666666666666666,0.125,1.0,0.5,0.0],[0.33400351847197784,0.015216388212891242,0.0,0.0, 1.0,1.0,0.0],[0.10781603417944208,0.06709553154495196,0.3333333333333333,0.25,1.0,1.0,1.0],[0.4533802462930384,0.050748620223090936,0.3333333333333333,0.0,1.0,0.5,0.0],[0.5224930887157577,0.025374310111545468,0.0,0.0,1.0, 0.5, 0.0], [0.6355868308620256, 0.02444717185746977, 0.0, 0.0, 1。 0,0.5,0.0],[0.2711736617240513,0.12999454257145598,0.0,0.125,1.0,0.0,1.0],[0.6921337019351596,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.5036441316913798,0.028302115124416098,0.3333333333333333,0.0,1.0, 
1.0,0.0],[0.3465694898215632,0.014273049437744325,0.0,0.0,1.0,1.0,0.0],[0.6355868308620256,0.11980421963065935,0.16666666666666666,0.0,0.0,0.0,0.0],[0.19577783362653933,0.015094396337354966,0.0,0.0,0.5,1.0, 1.0],[0.37170143252073384,0.015712553569072387,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.016908073949327893,0.0,0.0,1.0,1.0,0.0],[0.3465694898215632,0.13575255909676823,0.3333333333333333,1.0,1.0,1.0,0.0] [0.5476250314149284,0.031425107138144774,0.16666666666666666,0.0,1.0,1.0,0.0],[0.4973611460165871,0.030741952635141623,0.0,0.0,1.0,0.5,1.0],[0.32143754712239253,0.015175789316712771,0.0,0.0,1.0,1.0,0.0],[ 0.20834380497612465, 0.016908073949327893, 0.0, 0.0, 1.0, 1.0, 0.0], [0.007288263382759489, 0 0.07746484096553544,0.16666666666666666,0.5,1.0,1.0,0.0],[0.10781603417944208,0.04006213192611313,0.3333333333333333,0.0,1.0,1.0,0.0],[0.3465694898215632,0.10735285047192313,0.16666666666666666,0.0,1.0,0.0,1.0],[0.5601910027645137,0.05445717323939373 ,0.6666666666666666,0.125,1.0,1.0,1.0],[0.3465694898215632,0.050602229972447406,0.0,0.0,1.0,0.0,0.0],[0.3465694898215632,0.11027245763075773,0.0,0.0,1.0,1.0,0.0],[0.7612465443578789,0.06538764528744409,0.0 ,0.0,1.0,0.0,0.0],[0.04498617743151546,0.056848213999904744,0.16666666666666666,0.5%,0.5%,1.0,0.0],[0.007288263382759489,0.021730754366528396,0.16666666666666666,0.125,1.0,1.0,1.0],[0.2586076903744659,0.015468569817999833,0.0,0.0 ,1.0,1.0,0.0],[0.6984166876099522,0.05991421140938287,0.0,0.0,0.0,0.0,0.0],[0.22090977632570996,0.015330377421392339,0.16666666666666666,0.125,1.0,1.0,0.0],[0.3465694898215632,0.04970768794751499,0.16666666666666666,0.375,1.0 , 1.0, 0.0],[0.6230208595124404,0.05604306762136532,0.0,0.0,0.0,0.0,1.0],[0.37170143252073384,0.025374310111545468,0.0,0.0,1.0,0.5,0.0],[0.4470972606182458,0.0,0.0,0.0,1.0,1.0,0.0] [0.3465694898215632,0.13575255909676823,0.3333333333333333,1.0,1.0,1.0,1.0],[0.3465694898215632,0.02937564362913533,0.0,0.0,0.0,0.5,0.0],[0.10781603417944208,0.061264319894317944,0.3333333333333333,0.5,1.0,1.0,0.0],[ 0.007288263382759489,0.0761229303346364,0.16666666666666666,0.25,1.0,0.5,0.0],[0.04498617743151546,0.04298993693898376,0.3333333333333333,0.0,1.0,1.0,1.0],[0.3465694898215632,0.09759350042902103,0.0,0.0,1.0,0.0,0.0],[0.3465694898215632, 0.030253985132996517,0.0,0.125,0.5,1.0,1.0],[0.5601910027645137,0.051822148727810165,0.0,0.0,1.0,0.0,0.0],[0.4973611460165871,0.030253985132996517,0.16666666666666666,0.125,0.5,1.0,0.0],[0.4470972606182458,0.015411575213749284, 0.0, 0.0, 1.0, 1.0, 0.0], [0.39683337521990447, 0.0253743101 11545468,0.0,0.0,1.0,0.5,1.0],[0.23347574767529528,0.025374310111545468,0.0,0.0,1.0,0.5,0.0],[0.23347574767529528,0.015330377421392339,0.0,0.125,1.0,1.0,1.0],[0.032420206081930136,0.050748620223090936, 0.16666666666666666,0.125,1.0,0.5,0.0],[0.5476250314149284,0.05410739813385612,0.0,0.0,0.0,0.0,1.0],[0.7235486303091229,0.2859895551532101,0.0,0.0,0.0,0.0,1.0],[0.3465694898215632,0.015126992566498259,0.0, 0.0,0.5,1.0,0.0],[0.5224930887157577,0.01640390592611157,0.16666666666666666,0.0,1.0,1.0,0.0],[0.3465694898215632,0.015126992566498259,0.0,0.0,0.5,1.0,1.0],[0.2963056044232219,0.025374310111545468,0.0,0.0, 1.0, 0.5, 1.0]]

ytf = [[0.0], [1.0], [1.0], [1.0], [0.0], [0.0], [0.0], [0.0], [1.0], [1.0], [1.0], [1.0], [0.0], [0.0], [0.0], [1.0], [0.0], [1.0], [0.0], [1.0], [0.0], [1.0], [1.0], [1.0] ], [0.0], [1.0], [0.0], [0.0], [1.0], [0.0], [0.0], [1.0], [1.0], [0.0], [0.0], [0.0], [1.0], [0.0], [0.0], [1.0], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [0.0], [1.0], [0.0] ], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [1.0], [1.0], [0.0], [1.0], [0.0], [0.0], [1.0], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [1.0], [0.0], [0.0], [0.0], [0.0], [0.0] ], [1.0], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [1.0], [1.0], [0.0], [1.0], [1.0], [0.0], [0.0], [1.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [1.0], [1.0] ], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [1.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [0.0], [1.0 ], [0.0], [1.0], [0.0], [1.0], [1.0], [0.0], [0.0], [0.0], [0.0], [1.0], [0.0], [0.0], [1.0]、[0.0]、[0.0]、[0.0]、[0.0]、[1. 0]、[1.0]、[0.0]、[0.0]、[0.0]、[1.0]、[0.0]、[0.0]、[0.0]、[0.0]、[1.0]、[0.0]、[0.0] , [0.0], [0.0], [1.0], [0.0], [0.0], [0.0], [0.0], [1.0], [0.0], [0.0], [0.0], [1.0], [ 1.0]、[0.0]、[0.0]、[0.0]、[0.0]、[0.0]、[1.0]、[0.0]、[0.0]、[0.0]、[0.0]、[0.0]、[0.0] , [0.0], [0.0], [0.0], [0.0], [1.0], [1.0], [0.0], [1.0], [1.0], [0.0], [0.0], [1.0], [ 0.0]、[1.0]、[1.0]、[1.0]、[1.0]、[0.0]、[0.0]、[1.0]、[0.0]]

XT(测试数据)= [[0.45272319662402744,0.015281580671177828,0.0,0.0,0.5,1.0,0.0],[0.6175656072794409,0.013663090060062943,0.0,0.125,1.0,1.0,1.0],[0.815376500065937,0.018908740708122825,0.0,0.0 ,0.5%,0.5,0.0],[0.3538177502307794,0.016908073949327893,0.0,0.0,1.0,1.0,0.0],[0.287880785968614,0.023983602730431916,0.1111111111111111,0.125,1.0,1.0,1.0],[0.18238164314914942,0.01800600082915438,0.0,0.0,1.0 ,1.0,0.0],[0.393799287880786,0.014891206694617444,0.0,0.0,0.5,1.0,1.0],[0.0563035783463,0.0566035783463,0.0566035783463,0.05660423048832196,0.0566035783463,0.0566035783463,0.05660423048832196,0.111111111111111,10.125,1.0,0.5,0.0]] P>

[Comments]:

How about printing some variables at each iteration, e.g. cross_entropy and ytf, to see how the network is converging? (A random thought: if the dataset starts out as all 0s and then turns into all 1s, the model may "forget" about the 0s.) Possible duplicate of Tensorflow sigmoid and cross entropy vs sigmoid_cross_entropy_with_logits

[Answer 1]:

You are applying the categorical cross entropy function to a binary classification problem:

y_ * tf.log(ytf)

Here you have tf.log(ytf), so if ytf is 1, tf.log(ytf) always evaluates to 0. As a result, your model never learns anything.

You need to use the binary cross entropy function:

y_ * tf.log(ytf) + (1. - y_) * tf.log(1. - ytf)
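
For illustration, here is a minimal sketch of what that might look like in the question's graph. Two assumptions go beyond the literal answer above: the output activation is switched from tf.nn.softmax to tf.nn.sigmoid (a softmax over a single output unit always returns 1, so a sigmoid is needed for a one-unit binary output to vary), and a small epsilon is added inside the logs to keep them away from log(0):

import math
import tensorflow as tf

xtf = tf.placeholder(tf.float32, [None, 7])
y_ = tf.placeholder(tf.float32, [None, 1])
w = tf.Variable(tf.truncated_normal([7, 1], stddev=1. / math.sqrt(7)))
b = tf.Variable(tf.zeros([1]))

# Assumption: sigmoid instead of softmax for a single binary output unit.
ytf = tf.nn.sigmoid(tf.matmul(xtf, w) + b)

# Binary cross entropy; the epsilon keeps tf.log away from log(0).
eps = 1e-7
cross_entropy = tf.reduce_mean(
    -(y_ * tf.log(ytf + eps) + (1. - y_) * tf.log(1. - ytf + eps)))
train_step = tf.train.GradientDescentOptimizer(0.001).minimize(cross_entropy)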

When training, always fetch the loss as one of the outputs and keep printing it; this will help you debug the model much faster.

_, training_loss = sess.run([train_step, cross_entropy], feed_dict={xtf: x_batch, y_: y_batch})
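
In the context of the question's own loop, that could look roughly like this (a sketch that reuses the asker's ptr-based batching):

ptr = 0
step = 0
while (ptr + 100) <= (len(x) - 1):
    x_batch, y_batch = x[ptr:ptr + 100], y[ptr:ptr + 100]
    _, training_loss = sess.run([train_step, cross_entropy],
                                feed_dict={xtf: x_batch, y_: y_batch})
    if step % 10 == 0:
        print("step", step, "loss", training_loss)  # watch for 0.0 or nan here
    ptr += 100
    step += 1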

[Discussion]:

Now it returns nan for all values.
Clip your ytf to values slightly away from 0 and 1, e.g. tf.clip_by_value(ytf, 0.02, 0.98).
It still doesn't work properly. I have included my training data above if you want to take a look.
Print your loss during training and report it. Is it 0 or nan?
The loss is nan.
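
A common way to avoid the nan losses reported in this discussion (and the approach the duplicate linked in the comments points at) is to let TensorFlow compute the sigmoid and the cross entropy together from the raw logits, which is numerically stable. A sketch, assuming the same xtf, y_, w and b tensors as in the question:

logits = tf.matmul(xtf, w) + b  # raw scores, no activation here
cross_entropy = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.001).minimize(cross_entropy)

# For predictions, apply the sigmoid separately and threshold at 0.5.
ytf = tf.nn.sigmoid(logits)
predicted_class = tf.cast(ytf > 0.5, tf.float32)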
