
How to increase Jupyter notebook memory limit?

I am using Jupyter notebook with Python 3 on Windows 10. My computer has 8 GB of RAM and at least 4 GB of it is free.

But when I try to create a NumPy ndarray of size 6000*6000 with this command: np.zeros((6000, 6000), dtype='float64') I get this: Unable to allocate array with shape (6000, 6000) and data type float64

I don't think this should use more than 100 MB of RAM. I tried changing the dimensions to see what happens; the biggest array I can make is (5000, 5000). Did I make a mistake in estimating how much RAM I need?
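For reference, a quick back-of-the-envelope check (pure arithmetic, no allocation) suggests the 100 MB estimate is off by roughly a factor of three:

```python
# Memory needed for a 6000 x 6000 float64 array, computed without
# allocating it: number of elements times 8 bytes per float64.
rows, cols = 6000, 6000
total_bytes = rows * cols * 8
print(total_bytes)            # 288000000 bytes
print(total_bytes / 1024**2)  # about 274.7 MiB, well above 100 MB
```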

Jupyter notebook has a default memory limit. You can try to increase it with the following steps:
1) Generate a config file with:

jupyter notebook --generate-config
2) Open the jupyter_notebook_config.py file inside the '.jupyter' folder and edit the following property:
 c.NotebookApp.max_buffer_size = your desired value
Remember to remove the '#' at the start of that line.
3) Save the file and run jupyter notebook. It should now use the configured value. Also, don't forget to run the notebook from inside the jupyter folder.
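The edited config file would then look roughly like this (the 1 GiB value is only an illustration; pick whatever limit suits your machine):

```python
# jupyter_notebook_config.py
# The 'c' object is provided by Jupyter's config loader.
# Uncommented and set to 1 GiB (illustrative value):
c.NotebookApp.max_buffer_size = 1 * 1024 * 1024 * 1024
```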


Alternatively, you can simply run the notebook with the command below:

 jupyter notebook --NotebookApp.max_buffer_size=your_value

For Jupyter you need to consider two processes:

  1. The local HTTP server (which is based on Tornado)
  2. The kernel process (normally local, but it can be distributed, depending on your config).

max_buffer_size is a Tornado web server setting; it corresponds to the maximum amount of incoming data to buffer and defaults to 100 MB (104857600). (https://www.tornadoweb.org/en/stable/httpserver.html)
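The number in parentheses is exactly 100 MiB:

```python
# Tornado's documented default max_buffer_size, in bytes:
default = 104857600
print(default == 100 * 1024 * 1024)  # True
```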

Based on this PR, this value seems to have been increased to 500 MB in Notebook.

To my knowledge, the Tornado HTTP server does not let you define a maximum amount of memory; it runs as a regular Python 3 process.

For the kernel, you should look at the command defined in the kernel spec.
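To see which command that is, you can list the installed kernel specs (assuming jupyter is on your PATH); each listed directory contains a kernel.json whose "argv" entry is the command used to launch the kernel process:

```shell
# List installed kernels and the directories holding their specs.
jupyter kernelspec list
# Then inspect one of them, e.g. (path is illustrative):
# cat /usr/local/share/jupyter/kernels/python3/kernel.json
```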

One option to try would be this one.
