Using Dask Distributed with numpy.f2py function in module

I’m trying to use dask distributed to distribute a custom module across multiple workers. The module uses numpy.f2py, and I’m getting errors running it:


/nobackupp11/tjnorman/dask-worker-space/worker-jxynhyrj/ in
     11 sourcecode =
     12 f2py.compile(sourcecode, modulename='Dmmex_R14B_4')
---> 13 import Dmmex_R14B_4
     14 import numpy as np
     15 #import pandas as pd

ModuleNotFoundError: No module named 'Dmmex_R14B_4'

Here, Dmmex_R14B_4 is a Fortran program being converted for use in Python by numpy.f2py.
Does anyone have any suggestions on how to solve this problem?


Hi @tjnorman and welcome to Discourse! As the error message notes, the Dmmex_R14B_4 module is not accessible to the workers. In addition to the file '/…/EAP_02/', have you also already uploaded Dmmex_R14B_4 to the workers? Additionally, if you want to upload a more comprehensive module, you can package it into a zip or egg file; see Futures — Dask documentation for more details.
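For background on why client.upload_file behaves this way: each worker saves the uploaded file into its local directory and puts that directory on sys.path, so a later import only succeeds if what landed there is something Python can import (a .py file, a compiled extension, or a zip/egg). Here is a minimal stdlib-only sketch of that mechanism, with no Dask involved and a made-up module name:

```python
import importlib
import sys
import tempfile
from pathlib import Path

# Simulate a worker's local directory receiving an uploaded module.
# "my_uploaded_mod" is a hypothetical name for illustration only.
workdir = Path(tempfile.mkdtemp())
(workdir / "my_uploaded_mod.py").write_text("ANSWER = 42\n")

# A dask worker does the equivalent of this for its local directory.
sys.path.insert(0, str(workdir))
mod = importlib.import_module("my_uploaded_mod")
print(mod.ANSWER)  # -> 42
```

A raw .f source file placed in the same directory is not importable, which is why the ModuleNotFoundError persists even after a successful upload.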


Thanks @scharlotte for your response. I tried your suggestion and am encountering the same problem. Whether I use client.upload_file('/…/EAP_02/Dmmex_R14B_4.f') to upload the file, or upload all the relevant modules as a zip file, I get the error:

ModuleNotFoundError: No module named 'Dmmex_R14B_4'

Dmmex_R14B_4 is a Fortran file. Can Dask read Fortran files?

Hi @tjnorman thanks for the update!

Ah, I see, I think I misunderstood your question. I thought you had already converted Dmmex_R14B_4 to a Python module using numpy.f2py and were looking to load that module onto the workers. Instead, it sounds like you're hoping to upload a Fortran source file to the workers, which Dask cannot interpret. When I try to do this, I get a warning: distributed.utils - WARNING - Found nothing to import from fib1.f:

from dask.distributed import Client, LocalCluster

cluster = LocalCluster()
client = Client(cluster)
client.upload_file("fib1.f")  # triggers the warning above

Apologies for the confusion!
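To illustrate the zip route mentioned above: once the module actually contains importable Python code (or a compiled extension built locally with f2py), packaging it into a zip works because Python can import directly from zip archives on sys.path, which is what the workers rely on after an upload. A stdlib-only sketch with made-up names, and no Dask required to demonstrate the mechanism:

```python
import importlib
import sys
import tempfile
import zipfile
from pathlib import Path

# Build a zip containing a (dummy) pure-Python module, the same shape
# client.upload_file accepts for a zip/egg upload.
tmp = Path(tempfile.mkdtemp())
zip_path = tmp / "mypkg.zip"
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("mypkg_mod.py", "VALUE = 'hello'\n")

# Workers put uploaded zips on sys.path; zipimport handles the import.
sys.path.insert(0, str(zip_path))
mod = importlib.import_module("mypkg_mod")
print(mod.VALUE)  # -> hello
```

The key point is that the archive must contain importable Python modules or compiled extensions; zipping a raw .f source file would hit the same "Found nothing to import" warning.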