
Keep jupyter lab notebook running when SSH is terminated on local machine?

I would like to be able to turn off my local machine while my code keeps running in Jupyter Lab, and come back to it later. However, as soon as the SSH connection is terminated, the Jupyter Lab kernel stops. My code also stops executing when I close the Jupyter Lab browser tab.

From the Google Cloud Platform Marketplace I'm using a 'Deep Learning VM'. I SSH to it with the suggested Cloud SDK command: gcloud compute ssh --project projectname --zone zonename vmname -- -L 8080:localhost:8080. This opens a PuTTY connection to the VM, which already has Jupyter Lab running, and I can then access it on localhost.

What can I do so that my code keeps running while my local machine is off in this case?

I usually use "nohup" when running jupyter notebook through SSH!

:~$ nohup jupyter notebook --ip=0.0.0.0 --port=xxxx --no-browser &
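If you reconnect later and need the access URL or token again, it can usually be recovered; a minimal sketch (nohup writes the server log to nohup.out in the directory the command was run from):

:~$ jupyter notebook list    # lists running servers with their token URLs
:~$ tail nohup.out           # the server log also contains the token URL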

You can read more about nohup here.

Hope it helps!

You can use remote Notebook execution. Basically, your Notebook code will run on a remote machine and the results will be stored there or in GCS for later viewing.

You have the following options:

  • nbconvert-based options:

    • nbconvert: Provides a convenient way to execute the input cells of an .ipynb notebook file and save the results, both input and output cells, as an .ipynb file.

    • papermill: A Python package for parameterizing and executing Jupyter Notebooks. (Uses nbconvert --execute under the hood; see the sketch after this list.)

    • notebook executor: This tool can be used to schedule the execution of Jupyter notebooks from anywhere (local, GCE, GCP Notebooks) on a Cloud AI Deep Learning VM. You can read more about its usage here. (Uses the gcloud SDK and papermill under the hood.)

  • Notebook training tool

A Python package that allows users to run a Jupyter notebook as a Google Cloud AI Platform training job.

  • AI Platform Notebook Scheduler

This is in Alpha (Beta soon) with AI Platform Notebooks and is the recommended option. Scheduling a Notebook for recurring runs follows the exact same sequence of steps, but requires a crontab-formatted schedule option.
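As a minimal sketch of the two nbconvert-based options above, run on the VM (input.ipynb, output.ipynb and the epochs parameter are just placeholders):

:~$ jupyter nbconvert --to notebook --execute input.ipynb --output output.ipynb    # runs all cells and saves inputs + outputs to a new notebook
:~$ papermill input.ipynb output.ipynb -p epochs 10    # same idea, but also injects parameters into the cell tagged "parameters"

Combined with nohup (or one of the schedulers above), either command keeps executing after you disconnect.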

There are other options which allow you to execute Notebooks remotely:

  • tensorflow_cloud (Keras for GCP): Provides APIs that allow you to easily go from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.

  • GCP runner: Allows running any Jupyter notebook function on Google Cloud Platform. Unlike all the other solutions listed above, it allows you to run training for the whole project, not a single Python file or Jupyter notebook.
