You can probably ignore the comment above…
It does appear to be working as intended. I just had to cull part of the task graph to make things look nice.
```python
import dask.array as da
from dask.graph_manipulation import clone
from scipy.ndimage import gaussian_filter

test_arr = da.ones((32, 32), chunks=(16, 16))
overlapped_array = da.overlap.overlap(test_arr, depth=2, boundary=None)
filtered_overlapped = overlapped_array.map_blocks(gaussian_filter, sigma=1)
filtered = da.overlap.trim_overlap(filtered_overlapped, depth=2, boundary=None)
filtered.visualize()
```
```python
from dask import optimize

# clone only one dimension of blocks
overlapped_array_cloned = da.concatenate(
    [clone(b, assume_layers=True) for b in overlapped_array.blocks]
)
filtered_overlapped = overlapped_array_cloned.map_blocks(gaussian_filter, sigma=1)
filtered = da.overlap.trim_overlap(filtered_overlapped, depth=2, boundary=None)
optimize(filtered)[0].visualize()
```
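One rough way to see the duplication numerically, rather than by eyeballing the graph pictures, is to compare task counts before and after cloning. This is a sketch, assuming the optimized graph exposes its task count via `__dask_graph__()`; the duplicated data-creation tasks should make the cloned graph strictly larger:

```python
import dask.array as da
from dask import optimize
from dask.graph_manipulation import clone

test_arr = da.ones((32, 32), chunks=(16, 16))
overlapped = da.overlap.overlap(test_arr, depth=2, boundary=None)

# Clone each row of blocks so the rows stop sharing upstream tasks.
cloned = da.concatenate([clone(b, assume_layers=True) for b in overlapped.blocks])

# The cloned graph carries its own copies of the shared data-creation
# tasks, so it contains more tasks after optimization.
n_shared = len(optimize(overlapped)[0].__dask_graph__())
n_cloned = len(optimize(cloned)[0].__dask_graph__())
print(n_shared, n_cloned)
```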
So you can see that the second graph is a bit more embarrassingly parallel: the data creation is duplicated rather than reused many times. Along one dimension the problem is now entirely separable, but along the other there is still a lot of cross-talk between chunks. We can take this to the extreme:
```python
from functools import reduce
from operator import mul
from dask import optimize

def reshape(lst, shape):
    """Recursively nest a flat list into the given shape."""
    if len(shape) == 1:
        return lst
    n = reduce(mul, shape[1:])
    return [reshape(lst[i * n:(i + 1) * n], shape[1:]) for i in range(len(lst) // n)]

test_arr = da.ones((32, 32), chunks=(16, 16))
overlapped_array = da.overlap.overlap(test_arr, depth=2, boundary=None)
# clone 2 dimensions of blocks
overlapped_array_cloned = da.block(
    reshape([clone(b, assume_layers=True) for b in overlapped_array.blocks.ravel()], (2, 2))
)
filtered_overlapped = overlapped_array_cloned.map_blocks(gaussian_filter, sigma=1)
filtered = da.overlap.trim_overlap(filtered_overlapped, depth=2, boundary=None)
optimize(filtered)[0].visualize()
```
The problem is now entirely embarrassingly parallel, at the cost of each of our original keys being generated 4 separate times.
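As a quick sanity check (a sketch, using scipy's `gaussian_filter` blockwise exactly as in the snippets above), cloning should only rewrite task names, not task contents, so the cloned pipeline produces the same values as the shared one:

```python
import numpy as np
import dask.array as da
from dask.graph_manipulation import clone
from scipy.ndimage import gaussian_filter

test_arr = da.ones((32, 32), chunks=(16, 16))
overlapped = da.overlap.overlap(test_arr, depth=2, boundary=None)

def filter_and_trim(arr):
    # Apply the filter blockwise, then trim the overlap regions back off.
    filtered = arr.map_blocks(gaussian_filter, sigma=1)
    return da.overlap.trim_overlap(filtered, depth=2, boundary=None)

plain = filter_and_trim(overlapped)
cloned = filter_and_trim(
    da.concatenate([clone(b, assume_layers=True) for b in overlapped.blocks])
)

# Cloning duplicates tasks but doesn't change what they compute.
same = np.allclose(plain.compute(), cloned.compute())
print(same)
```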
When I try to do this with data I loaded myself, I get a weird error when computing, but with an array of ones everything works fine.
This appears to be related to dask/dask#9621 (Error when computing a cloned graph from xarray.open_dataset).
```
File ~/micromamba/envs/dev/lib/python3.9/site-packages/dask/array/core.py:120, in getter()
    118     lock.acquire()
    119 try:
--> 120     c = a[b]
    121     # Below we special-case `np.matrix` to force a conversion to
    122     # `np.ndarray` and preserve original Dask behavior for `getter`,
    123     # as for all purposes `np.matrix` is array-like and thus
    124     # `is_arraylike` evaluates to `True` in that case.
    125     if asarray and (not is_arraylike(c) or isinstance(c, np.matrix)):

TypeError: string indices must be integers
```