Multiple Distributed Clients Combined on the same Dashboard

Is it possible to spin up multiple Dask Clients within the same Python process and connect them to the same dashboard?

For example, I would like two clients, each backed by a different cluster abstraction, but both submitting task information to the same dashboard:

from dask.distributed import Client, LocalCluster
from dask_jobqueue import LSFCluster

local_cluster = LocalCluster(
    threads_per_worker=1,
    n_workers=8,
)
master_client = Client(
    address=local_cluster,
    set_as_default=True,
)
dashboard_url = master_client.dashboard_link

arvados_cluster = LSFCluster(
    queue='long',
    cores=24,
    memory='1GB',
    walltime='72:00',
    job_extra_directives=['-o /tmp/job_out'],  # write job output to a file, no mails
)
arvados_cluster.scale(1)
arvados_client = Client(
    address=arvados_cluster,
    set_as_default=False,
    dashboard_address=dashboard_url,  # this parameter would be needed
)

Hi @schulz-m, welcome to Dask discourse forum!

I don’t think this is possible, as the Dashboard is not linked to a Client, but to a Scheduler. If you launch multiple Scheduler (Cluster) instances, you’ll have one dashboard per Scheduler.
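
To make that concrete with the objects from your snippet: each Cluster wraps its own Scheduler, and each Scheduler serves its own Bokeh dashboard, so the two links will differ (a small sketch, the ports are just examples):

# Each Scheduler serves its own dashboard (default port 8787, otherwise an
# automatically chosen free port)
print(local_cluster.dashboard_link)    # e.g. http://127.0.0.1:8787/status
print(arvados_cluster.dashboard_link)  # a different port, e.g. http://127.0.0.1:8788/status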

One way to achieve what you want would be to have a single Scheduler for all of your resources, but using dask-jobqueue with an existing Scheduler is not supported yet.
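
Just to illustrate what "a single Scheduler for all of your resources" would mean (this is only a rough sketch, not something dask-jobqueue can do for you today): you would keep the LocalCluster's Scheduler and attach any externally provisioned workers to it yourself, for example by starting plain Dask workers from inside LSF jobs that you submit on your own:

# Sketch only: the bsub job script is an assumption about your LSF setup.
scheduler_address = local_cluster.scheduler_address  # e.g. tcp://10.0.0.5:36525

# Inside the bsub-submitted job script, start a worker that joins the existing
# Scheduler instead of creating a new cluster, so its tasks show up on the
# same dashboard:
#   dask worker tcp://10.0.0.5:36525 --nthreads 1 --memory-limit 1GB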

I don’t think multiple Schedulers could share information in the same Dashboard.

Hi @guillaumeeb

Thank you for the quick response!

Using a single scheduler with several "cluster abstractions" would also be great - would you know if there is any roadmap for implementing that? :slight_smile: