I use the following to create a local cluster from a Jupyter notebook:
from dask.distributed import Client, LocalCluster
cluster = LocalCluster(n_workers=24)
c = Client(cluster)
Is it possible to connect from another notebook while the kernel is busy with a compute operation?
My goal is to access 'total_occupancy', for example.
As suggested by @moshevi, you can connect to the scheduler by providing its address:
client = Client("address-of-scheduler")
Then you can use client.run_on_scheduler to execute operations on the remote scheduler:
client.run_on_scheduler(lambda dask_scheduler: dask_scheduler.total_occupancy)
https://docs.dask.org/en/latest/futures.html#distributed.Client.run_on_scheduler
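Putting the two steps together, a minimal sketch; it starts a small throwaway LocalCluster to stand in for the already-running one (in practice you would paste in the address of your existing scheduler instead):

```python
from dask.distributed import Client, LocalCluster

# Stand-in for the cluster started in the first notebook; in practice
# it is already running and you only know its address.
cluster = LocalCluster(n_workers=1, dashboard_address=None)

# "Second notebook": connect by address rather than by cluster object.
client = Client(cluster.scheduler_address)

# The function runs inside the scheduler process and its return value is
# shipped back to the client, even while the workers are busy computing.
occupancy = client.run_on_scheduler(
    lambda dask_scheduler: dask_scheduler.total_occupancy
)
print(occupancy)  # total occupancy; 0 on an idle cluster

client.close()
cluster.close()
```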
You could connect to the running cluster:
c_different_notebook = Client('127.0.0.1:8786')  # 8786 is the default scheduler port
I would advise explicitly specifying the host and port in the original cluster rather than relying on the defaults.
You can access the scheduler via the client's cluster:
c_different_notebook.cluster.scheduler.total_occupancy
Note that `.cluster` is only populated for clients created from a cluster object in the same process; a client connected purely by address should use run_on_scheduler instead.
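A sketch of pinning the address explicitly in the original notebook (assumption: port 8786 is free on the machine), so other notebooks can connect to a known address:

```python
from dask.distributed import Client, LocalCluster

# Fix host and port explicitly instead of relying on defaults
# (assumption: port 8786 is free on this machine).
cluster = LocalCluster(n_workers=1, host="127.0.0.1",
                       scheduler_port=8786, dashboard_address=None)
c = Client(cluster)

addr = cluster.scheduler_address
print(addr)  # tcp://127.0.0.1:8786

# From any other notebook on this machine the address is now known:
c_other = Client(addr)

# In the process that owns the LocalCluster, the scheduler object
# itself is reachable directly:
print(cluster.scheduler.total_occupancy)

c_other.close()
c.close()
cluster.close()
```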