Upscaling an image with dask-image leads to a blurry result

I am trying to upscale a portion of an image (200 x 200 pixels) with dask_image into a 400 x 400 pixel image. See the result in the picture.

The 400 x 400 pixel tile that I obtain is blurry, even though the queried region is smaller than the output. I am using the following code:

import dask_image.ndinterp

# interpolation settings: order=0 is nearest-neighbour, spline prefilter off
kwargs = {"prefilter": False, "order": 0}
# tried also these, and other combinations
# kwargs = {"prefilter": True}
# kwargs = {}

# resample the image
transformed_dask = dask_image.ndinterp.affine_transform(
    xdata.data,
    matrix=matrix,
    output_shape=output_shape,
    **kwargs,
)

Is there a way to get a sharper result? Intuitively, I would expect that resampling a low-resolution image into a higher-resolution one should not make it blurrier.

The use case is generating tiles for deep learning: I query a region in natural coordinates (micrometers) and get back a tile of the desired size in pixels. At the moment, because of the behavior described above, I only get sharp images when the queried extent in micrometers, converted to pixels, corresponds exactly to the pixel size of the requested tile, which is not the general case.
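For illustration, here is roughly how such a query-to-tile matrix can be constructed for a single-channel 2D image (a minimal sketch with made-up numbers; the variable names are illustrative, not from my actual pipeline):

import numpy as np

# Illustrative values, not from the real pipeline
um_per_pixel = 0.5        # physical size of one source pixel, in micrometers
query_origin_um = 575.0   # top-left corner of the query, in micrometers
query_size_um = 100.0     # side length of the queried region, in micrometers
tile_size_px = 400        # desired output tile size, in pixels

# size of the queried region in source pixels
query_size_px = query_size_um / um_per_pixel

# each output pixel maps back to this many source pixels
scale = query_size_px / tile_size_px

# homogeneous 2D matrix mapping output coordinates back to input coordinates,
# in the convention used by (dask_image / scipy) affine_transform
matrix = np.array(
    [
        [scale, 0, query_origin_um / um_per_pixel],
        [0, scale, query_origin_um / um_per_pixel],
        [0, 0, 1],
    ]
)
output_shape = (tile_size_px, tile_size_px)

Whenever scale is fractional, some interpolation is unavoidable, which is where the choice of order and prefilter comes into play.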

Hi @LucaMarconato, welcome to this forum!

Did you try your example with the SciPy ndimage functions on which dask-image is built? Do you get different behavior?

Could you also give a minimal, complete, reproducible example, including some data? That would make this much easier to investigate.
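The equivalent direct SciPy call would look something like this (just a sketch; arr stands for the materialized NumPy array, and the keyword arguments mirror the ones in your snippet):

import numpy as np
from scipy import ndimage

# arr: the source image as a plain NumPy array, e.g. xdata.data.compute()
result_scipy = ndimage.affine_transform(
    arr,
    matrix=matrix,
    output_shape=output_shape,
    order=0,
    prefilter=False,
)

# check whether the dask-image result matches the SciPy one
np.testing.assert_array_equal(result_scipy, transformed_dask.compute())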

Hi @guillaumeeb thank you for the welcome and the answer.

I have made a minimal working example, but I can't reproduce the bug with it; I now suspect the issue may be unrelated to dask_image.

I’ll give updates later.

For completeness, here is the example (everything works as expected).

import numpy as np
from skimage import data
from dask_image.ndinterp import affine_transform as affine_transform_dask
import dask.array as da
import matplotlib.pyplot as plt
from xarray import DataArray

image = data.astronaut()
big_image = DataArray(da.tile(image, (100, 100, 1)), dims=["y", "x", "c"])

crop_start = 1150
crop_end = 1250
# crop_start = 1300
# crop_end = 1400
crop = big_image.sel({'x': slice(crop_start, crop_end), 'y': slice(crop_start, crop_end)}).compute()

scale_factor = 4.5
# homogeneous matrix mapping output coordinates back to input coordinates:
# each output pixel samples the input at 1/scale_factor steps, offset by the crop origin
matrix = np.array(
    [
        [1. / scale_factor, 0, 0, crop_start],
        [0, 1. / scale_factor, 0, crop_start],
        [0, 0, 1, 0],
        [0, 0, 0, 1],
    ]
)
# output_shape must contain integers, so the scaled side length is cast explicitly
side = int((crop_end - crop_start) * scale_factor)
output_shape = (side, side, 3)
image_transformed = affine_transform_dask(
    big_image.data,
    matrix=matrix,
    output_shape=output_shape,
)

plt.figure(figsize=(20, 10))
plt.subplot(1, 2, 1)
plt.imshow(crop)
plt.subplot(1, 2, 2)
plt.imshow(image_transformed.compute())
plt.show()
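To rule out interpolation blur in the same example, one can also run a nearest-neighbour variant (a sketch reusing the arrays defined above; the upscaled tile should then show hard-edged, blocky pixels rather than blur):

# nearest-neighbour resampling: no interpolation between source pixels
image_transformed_nn = affine_transform_dask(
    big_image.data,
    matrix=matrix,
    output_shape=output_shape,
    order=0,
    prefilter=False,
)
plt.imshow(image_transformed_nn.compute())
plt.show()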

Update: indeed, the bug was in another part of the code; dask-image works like a charm! Sorry for the inconvenience, I should have made the minimal example before asking.
