Is it possible to spin up multiple Dask Clients within the same Python process and connect them to the same dashboard?
For example, I would like to have two clients, each using a different cluster abstraction, but both submitting task information to the same dashboard:
from dask.distributed import Client, LocalCluster
from dask_jobqueue import LSFCluster

local_cluster = LocalCluster(
    threads_per_worker=1,
    n_workers=8,
)
master_client = Client(
    address=local_cluster,
    set_as_default=True,
)
dashboard_url = master_client.dashboard_link

arvados_cluster = LSFCluster(
    queue='long',
    cores=24,
    memory='1GB',
    walltime='72:00',
    job_extra_directives=['-o /tmp/job_out'],  # redirect job output so LSF sends no mails
)
arvados_cluster.scale(1)

arvados_client = Client(
    address=arvados_cluster,
    set_as_default=False,
    dashboard_address=dashboard_url,  # hypothetical parameter: this is what I would need
)