Assigning a new numpy array to an existing variable does not free the old array's memory

If I run the code below, I would expect the second assignment to array_1 to free the old memory that array_1 was using, but it seems to keep a copy around even after forcing a gc. Example:

import numpy as np
import psutil
import os
import gc

BIG_ARRAY = 1280*3*960
process = psutil.Process(os.getpid())

def make_frames(num):
    frames = []
    for i in range(int(num)):
        frames.append(np.arange(BIG_ARRAY).astype(float))
    return frames

def print_current_memory_usage(msg=''):
    process = psutil.Process(os.getpid())
    print('Using {:.2} GBs of mem for {}'.format(process.memory_info().rss / 1E9, msg))


print_current_memory_usage('Initial mem')
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after making large object')
array_1 = make_frames(1E2)
print_current_memory_usage('Mem after replacing old large object (expect mem to stay same)')
gc.collect()
print_current_memory_usage('Mem gc (expect to change if this is just a gc timing issue)')

I consistently get the following output:

Using 0.028 GBs of mem for Initial mem 
Using 3.0 GBs of mem for Mem after making large object 
Using 5.9 GBs of mem for Mem after replacing old large object (expect mem to stay same) 
Using 5.9 GBs of mem for Mem gc (expect to change if this is just a gc timing issue)

It looks like the initial copy of array_1 is never freed. Can someone explain why this happens?
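One thing worth separating out is whether CPython actually frees the old object when the name is rebound, as opposed to the process simply not returning the pages to the OS. A minimal sketch using weakref (the Frames subclass and the tiny bytearray payload are illustrative stand-ins, since plain lists cannot be weak-referenced):

```python
import gc
import weakref

class Frames(list):
    """list subclass so we can take a weak reference to it."""
    pass

# Stand-in for the list returned by make_frames().
frames = Frames([bytearray(8)])
old = weakref.ref(frames)
assert old() is frames

# Rebind the name, dropping the last strong reference to the old list.
frames = Frames([bytearray(8)])
gc.collect()

# The weak reference is now dead: CPython did deallocate the old object.
print(old() is None)  # True
```

If this prints True (it does in CPython, where refcounting frees the object as soon as the name is rebound), then the extra RSS is about where the freed memory goes, not about the object still being alive.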

Note that this only happens when the appended items are numpy arrays; if I instead append a plain list, i.e. frames.append([i for i in range(BIG_ARRAY)]), I get the expected result:

Using 0.028 GBs of mem for Initial mem
Using 4.1 GBs of mem for Mem after making large object
Using 4.1 GBs of mem for Mem after replacing old large object (expect mem to stay same)
Using 4.1 GBs of mem for Mem gc (expect to change if this is just a gc timing issue)
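As a hedged workaround sketch, assuming the extra memory comes from the old and the new generation of frames being resident at the same time while make_frames runs: dropping the old reference before rebuilding keeps the peak at roughly one copy (whether the allocator then returns the freed pages to the OS is platform- and allocator-dependent):

```python
import gc
import numpy as np

BIG_ARRAY = 1280 * 3 * 960

def make_frames(num):
    return [np.arange(BIG_ARRAY).astype(float) for _ in range(int(num))]

array_1 = make_frames(4)

# Drop the only reference to the old frames *before* building the new
# ones, so that both generations are never alive simultaneously.
del array_1
gc.collect()
array_1 = make_frames(4)

print(len(array_1))  # 4
```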