
Why does importing numpy add 1 GB of virtual memory on Linux?

I have to run Python in a resource-constrained environment with only a few GB of virtual memory. Worse yet, the application design requires forking children from my main process, each of which receives a copy-on-write allocation of that same amount of virtual memory on fork. The result is that after forking only 1-2 children, the process group hits the ceiling and shuts everything down. Finally, I am not able to remove numpy as a dependency; it is a strict requirement.
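
For context on the fork cost described above, here is a minimal sketch (not from the original post; the vm_size_kb helper is hypothetical) showing that each child reports the parent's full VmSize immediately after fork, since every mapping is inherited copy-on-write:

import os

def vm_size_kb(pid):
    # Read VmSize (total virtual memory) from /proc/<pid>/status.
    with open('/proc/{}/status'.format(pid)) as f:
        for line in f:
            if line.startswith('VmSize:'):
                return int(line.split()[1])

print('parent VmSize: {} kB'.format(vm_size_kb(os.getpid())))

pid = os.fork()
if pid == 0:
    # The child inherits every mapping copy-on-write, so it reports
    # the same VmSize as the parent and is charged again against any
    # per-group virtual memory limit.
    print('child VmSize: {} kB'.format(vm_size_kb(os.getpid())))
    os._exit(0)
os.waitpid(pid, 0)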

Any advice on how I can bring this initial memory allocation down?

e.g.

  1. Change the default amount allocated to numpy on import?
  2. Disable the feature and force Python / numpy to allocate more dynamically?


Details:

Red Hat Enterprise Linux Server release 6.9 (Santiago)
Python 3.6.2
numpy>=1.13.3

Bare Interpreter:

import os

# Memory accounting for this interpreter before numpy is loaded.
os.system('cat "/proc/{}/status"'.format(os.getpid()))

# ... VmRSS: 7300 kB      (resident set)
# ... VmData: 4348 kB     (data segment)
# ... VmSize: 129160 kB   (total virtual memory)

import numpy

# The same counters immediately after the import.
os.system('cat "/proc/{}/status"'.format(os.getpid()))

# ... VmRSS: 21020 kB
# ... VmData: 1003220 kB
# ... VmSize: 1247088 kB  (~1.1 GB of virtual memory)

Thank you, skullgoblet1089, for raising this question on SO and at https://github.com/numpy/numpy/issues/10455 , and for answering it. Citing your 2018-01-24 post:

Reducing threads with export OMP_NUM_THREADS=4 will bring down VM allocation.
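
As discussed in the linked numpy issue, the large VmSize comes from numpy's BLAS backend (commonly OpenBLAS) reserving per-thread buffers at import time, one set per detected core, so capping the thread count caps the reservation. The variable must be set before numpy is imported. A minimal sketch of doing the same from inside Python rather than the shell (OPENBLAS_NUM_THREADS is included on the assumption of an OpenBLAS-backed build):

import os

# Cap the OpenMP/BLAS thread pool *before* numpy is imported; setting
# these after the import has no effect on the already-created pool.
os.environ['OMP_NUM_THREADS'] = '4'
os.environ['OPENBLAS_NUM_THREADS'] = '4'  # assumption: OpenBLAS-backed numpy

import numpy
os.system('grep Vm "/proc/{}/status"'.format(os.getpid()))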

