
How to increase Jupyter notebook max buffer size in Google Cloud AI Platform?

I am running Jupyter notebook on Google Cloud Platform. I have a big pickled DataFrame to read. Since the default buffer size of Jupyter notebook is around 0.5 GB, it crashes and restarts the kernel. I have added NotebookApp.max_buffer_size='my desired value' inside jupyter_notebook_config.py on the Compute Engine instance, but the problem is still there.
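
For reference, that setting normally goes into jupyter_notebook_config.py as a Python line whose value is an integer number of bytes; something like the sketch below (the 10 GiB figure is only an illustration, not a recommended value):

# /home/jupyter/.jupyter/jupyter_notebook_config.py
# max_buffer_size is given in bytes; 10 GiB here is just an example value
c.NotebookApp.max_buffer_size = 10 * 1024 * 1024 * 1024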

I think resizing the instance is supposed to change the memory limits, but I got the same problem after resizing to a bigger instance. Here is my workaround:

I would recommend you save your work before trying this, just in case.

Jupyter runs as a service, so open its service config:

sudo vim /lib/systemd/system/jupyter.service

You should see a config similar to this:

[Unit]
Description=Jupyter Notebook

[Service]
Type=simple
PIDFile=/run/jupyter.pid
MemoryHigh=34359738368
MemoryMax=34359738368
ExecStart=/bin/bash --login -c '/opt/conda/bin/jupyter lab --config=/home/jupyter/.jupyter/jupyter_notebook_config.py'
User=jupyter
Group=jupyter
WorkingDirectory=/home/jupyter
Restart=always

[Install]
WantedBy=multi-user.target

Change MemoryHigh and MemoryMax to the desired values, in bytes.
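
For example, to allow roughly 16 GiB (an arbitrary figure here, pick whatever fits your instance; 16 * 1024^3 = 17179869184 bytes), those two lines would become:

MemoryHigh=17179869184
MemoryMax=17179869184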

Then:

sudo systemctl daemon-reload
sudo systemctl restart jupyter

Wait a little for jupyter to restart and you should be good to go.
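
If you want to double-check that the new limits were picked up, something like this should show them (values are reported in bytes) and confirm the service is running again:

systemctl show jupyter --property=MemoryHigh,MemoryMax
systemctl status jupyter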
