
Jupyter notebook is extremely slow when re-running cells

I have a relatively large Jupyter Notebook (about 40 GB of Pandas DataFrames in RAM). I'm running a Python 3.6 kernel installed with Conda.

I have about 115 cells that I'm executing. If I restart the kernel and run all the cells, the whole notebook finishes in about 3 minutes. But if I then re-run a simple cell that does almost no work (e.g., a function definition), it takes an extremely long time to execute (~15 minutes).

I cannot find any documentation online covering Jupyter Notebook installation best practices. My disk usage is low, available RAM is high, and CPU load is very low.

My swap space does seem to be maxed out, but I'm not sure what would be causing this.

Any recommendations for troubleshooting a poorly performing Jupyter Notebook server? The problem seems to affect only the re-running of cells.

If the Variable Inspector nbextension is enabled, it can slow down the notebook considerably when you have large variables in memory (such as your Pandas DataFrames), because it re-inspects them after every cell execution.

See: https://github.com/ipython-contrib/jupyter_contrib_nbextensions/issues/1275

If that's the case, try disabling it under Edit -> nbextensions config.
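If you prefer the command line, a minimal sketch of the same fix is below. It assumes the extension was installed via the jupyter_contrib_nbextensions package, whose Variable Inspector is registered under the path `varInspector/main`:

```shell
# List currently enabled nbextensions to confirm the Variable Inspector is active
jupyter nbextension list

# Disable the Variable Inspector for the current user
# (extension path "varInspector/main" assumed from jupyter_contrib_nbextensions)
jupyter nbextension disable varInspector/main
```

After disabling it, restart the notebook server and re-run a cheap cell to see whether the delay disappears.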
