Speculative execution?

Spark has a concept whereby you can mark certain tasks as likely to have variable performance, so the scheduler submits multiple copies of each such task to workers, and the result of whichever finishes first is used by dependents. Is there any way to trigger this behaviour in Dask? It sounds like it could maybe be done as a scheduler plugin.


Hi @martindurant! At the moment this isn’t supported, so feel free to open an issue as a feature request. Worth noting that the reverse is supported: if multiple workers happen to be running the same task, the first one to finish provides the accepted result. This only works if the results are identical, though.
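In the meantime, the pattern can be hand-rolled at the client level: submit several copies of a variable-latency task, take whichever finishes first, and discard the stragglers. Here is a minimal sketch using only the Python standard library; `slow_task`, `speculative`, and the copy count are illustrative names, not Dask API, though the same structure maps onto `distributed.Client.submit` plus `distributed.wait(..., return_when="FIRST_COMPLETED")`.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED


def slow_task(x):
    # Simulate variable performance: some copies are much slower than others.
    time.sleep(random.uniform(0.01, 0.2))
    return x * 2


def speculative(fn, arg, copies=3):
    """Run `copies` duplicates of fn(arg); return the first result to arrive."""
    with ThreadPoolExecutor(max_workers=copies) as pool:
        futures = [pool.submit(fn, arg) for _ in range(copies)]
        done, pending = wait(futures, return_when=FIRST_COMPLETED)
        for fut in pending:
            # Attempt to cancel stragglers; copies already running simply
            # finish in the background and their results are ignored.
            fut.cancel()
        return next(iter(done)).result()


print(speculative(slow_task, 21))  # → 42
```

This only makes sense when the task is idempotent and side-effect free, which is the same constraint the current first-to-finish behaviour in Dask relies on.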
