Keras memory leak while training a GAN
I am trying to train a GAN with Keras; the problem is that it keeps filling up RAM. This is the code I use for training:
import gc

cont = 0
while cont < 20:
    cont += 1
    # train the discriminator on one batch
    img_to_train_discr = image_generator(8)
    # it returns a tuple (images, 0/1 labels)
    discr.train_on_batch(img_to_train_discr[0], img_to_train_discr[1])
    # train the combined GAN model on one batch
    img_to_train_gan = image_generator_for_gan(8)
    gan.train_on_batch(img_to_train_gan[0], img_to_train_gan[1])
    # grab every object the garbage collector is tracking, to inspect what stays alive
    found_objects = gc.get_objects()
Both .fit and .train_on_batch show memory usage increasing over the epochs.
I added gc.get_objects() because I wanted to investigate which elements were not being deleted.
I iterated over the found_objects list and found a possible cause of the problem: it is keeping the values...
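A minimal sketch of that kind of inspection, assuming the loop above; the filtering on tf.Tensor / numpy.ndarray is only an illustration of the idea, not the exact code from the question:

import gc
import collections
import numpy as np
import tensorflow as tf

def count_retained_arrays():
    """Tally gc-tracked tensors and arrays by type, to see what survives each iteration."""
    return collections.Counter(
        type(obj).__name__
        for obj in gc.get_objects()
        if isinstance(obj, (tf.Tensor, np.ndarray))
    )

# e.g. print(count_retained_arrays()) at the end of every training iteration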
With .fit, though, .get_objects() shows that some tensors are being kept; for example, the following was found when using .fit on the gan:
tf.Tensor([... 16 x 256 x 256 x 3 float pixel values in [-1, 1], elided ...], shape=(16, 256, 256, 3), dtype=float32)
692453
tf.Tensor(
[[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]
[1.]], shape=(16, 1), dtype=float32)
This is from training only the discriminator:
[<tf.Tensor: shape=(32, 256, 256, 3), dtype=float32, numpy=array([... 32 x 256 x 256 x 3 float pixel values elided ...], dtype=float32)>,
 <tf.Tensor: shape=(32,), dtype=int64, numpy=array([0, 1, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0])>]
The same thing happens if I use .fit instead of train_on_batch, but .fit is faster since I am using a generator.
Also, I am fairly confident the memory leak is inside fit / train_on_batch, because the inputs are numpy arrays rather than tensors (a quick check for this is sketched below).
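One way to test that hypothesis, purely illustrative and mirroring the loop above: convert the generator output to tensors explicitly before handing it to train_on_batch, so any growth can no longer be blamed on implicit numpy-to-tensor conversion inside the call.

import tensorflow as tf

# image_generator and discr are the names from the question
images, labels = image_generator(8)
x = tf.convert_to_tensor(images, dtype=tf.float32)
y = tf.convert_to_tensor(labels, dtype=tf.float32)
discr.train_on_batch(x, y)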
The memory does not seem to grow linearly; it increases in spikes.
I am using a pretrained resnet_v2.ResNet50V2 in the discriminator.
PS: I am not 100% sure what happens with train_on_batch, because printing the list of elements from gc.get_objects() ran into trouble, and the RAM kept filling up until it crashed.
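To make the spikes measurable per iteration, one option is to log resident memory after every step. This assumes the third-party psutil package, which is not part of the original question:

import os
import gc
import psutil  # third-party; assumed installed only for this measurement

process = psutil.Process(os.getpid())

def log_rss(step):
    """Print the process' resident memory so per-iteration growth is visible."""
    gc.collect()  # drop anything merely waiting for collection
    rss_mb = process.memory_info().rss / (1024 ** 2)
    print(f"step {step}: RSS = {rss_mb:.1f} MB")

# call log_rss(cont) at the end of each loop iteration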
Answer
There is a known issue with TF 2.x Keras where memory leaks appear when a network is called repeatedly inside a loop.
I came across a few suggestions online:
- Call tf.keras.backend.clear_session() in the loop from time to time, possibly together with gc.collect() (via this question).
- Wrap the train_on_batch call, or the model call, in a function with the @tf.function decorator (this is what worked for me); see the sketch below.
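Below is a minimal sketch combining both ideas. The GradientTape train step, loss_fn and optimizer are illustrative placeholders, and the gan model and image_generator_for_gan generator are assumed to exist as in the question; this is not the exact code from either the question or the answer.

import gc
import tensorflow as tf

# hypothetical loss/optimizer, just so the step is self-contained
loss_fn = tf.keras.losses.BinaryCrossentropy()
optimizer = tf.keras.optimizers.Adam(1e-4)

@tf.function  # the step is traced once and the same graph is reused on later calls
def gan_train_step(model, images, labels):
    with tf.GradientTape() as tape:
        preds = model(images, training=True)
        loss = loss_fn(labels, preds)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

cont = 0
while cont < 20:
    cont += 1
    images, labels = image_generator_for_gan(8)  # generator from the question
    gan_train_step(gan, tf.convert_to_tensor(images), tf.convert_to_tensor(labels))
    if cont % 10 == 0:
        # first suggestion: periodically reset Keras' global state and force a GC pass
        tf.keras.backend.clear_session()
        gc.collect()

If you keep train_on_batch instead of a custom step, the periodic clear_session() / gc.collect() calls can simply be dropped into the original loop.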