Jupyter Notebook Kernel Keeps Dying - Low Memory?
I am trying two different lines of code that both involve computing combinations of the rows of a DataFrame with 500k rows.
I think that because of the large number of combinations, the kernel keeps dying. Is there any way to resolve this?
Both lines of code that crash are
pd.merge(df.assign(key=0), df.assign(key=0), on='key').drop('key', axis=1)
and
index_comb = list(combinations(df.index, 2))
Both are different ways to produce the same desired DataFrame, but the kernel fails on both.
Would appreciate any help :/
Update: I tried running the code in a terminal and it gives me a "Killed: 9" error, so it is using too much memory in the terminal as well?
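For scale, a quick back-of-the-envelope check (my own illustration, not from the original post) of how many row pairs `list(combinations(df.index, 2))` would materialize for 500k rows:

```python
from math import comb

n = 500_000
pairs = comb(n, 2)  # number of unordered row pairs, n*(n-1)/2
print(pairs)  # 124999750000

# Each pair stored as a tuple of two ints costs very roughly 80-100 bytes
# in CPython, so the full list would need on the order of 10 TB of RAM.
# The cross join via pd.merge materializes n*n = 2.5e11 rows, which is
# even larger. Either way, a "Killed: 9" out-of-memory kill is expected.
```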
There is no solution here that I know of. Jupyter Notebook simply is not designed to handle huge quantities of data. Run your code from a terminal instead; that should work.
In case you run into the same problem when using a terminal, look here: Python Killed: 9 when running a code using dictionaries created from 2 csv files
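One possible workaround (my own sketch, not part of the original answer): consume the pairs lazily in fixed-size chunks instead of calling `list()` on the whole generator, so only one chunk is resident in memory at a time. The `chunked_pairs` helper, the chunk size, and the tiny stand-in DataFrame are all illustrative.

```python
from itertools import combinations, islice
import pandas as pd

# Tiny stand-in for the 500k-row DataFrame from the question.
df = pd.DataFrame({"value": range(5)})

def chunked_pairs(index, chunk_size=100_000):
    """Yield lists of index pairs, chunk_size pairs at a time."""
    it = combinations(index, 2)
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            break
        yield chunk

total = 0
for chunk in chunked_pairs(df.index, chunk_size=4):
    total += len(chunk)  # replace with the real per-chunk work

print(total)  # 10 pairs for a 5-row frame
```

This keeps peak memory proportional to `chunk_size` rather than to the full number of pairs, though processing 125 billion pairs will still take a very long time regardless of memory.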
Edit: I ended up finding a way to potentially solve this: increasing your container size should prevent Jupyter from running out of memory. To do so, open the settings.cfg file of Jupyter in the home directory of your Notebook, $CHORUS_NOTEBOOK_HOME.
The line to edit is this one:
#default memory per container
MEM_LIMIT_PER_CONTAINER="1g"
The default value is 1 GB per container; increasing this to 2 or 4 GB should help with memory-related crashes. However, I am unsure of any implications this has on performance, so be warned!
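A small sketch of that edit (the file layout is assumed from the snippet above; the demo operates on a throwaway copy rather than the real $CHORUS_NOTEBOOK_HOME/settings.cfg):

```python
import os
import re
import tempfile

# Demo on a throwaway copy of settings.cfg; for the real file, use
# os.path.join(os.environ["CHORUS_NOTEBOOK_HOME"], "settings.cfg").
fd, cfg_path = tempfile.mkstemp()
os.close(fd)
with open(cfg_path, "w") as f:
    f.write('#default memory per container\nMEM_LIMIT_PER_CONTAINER="1g"\n')

# Bump the per-container limit from 1g to 4g.
with open(cfg_path) as f:
    text = f.read()
text = re.sub(r'(?m)^MEM_LIMIT_PER_CONTAINER=.*$',
              'MEM_LIMIT_PER_CONTAINER="4g"', text)
with open(cfg_path, "w") as f:
    f.write(text)

print(text)
```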