Submit jobs to a Dask cluster with an additional dependency

I have a function that depends on a local project. Since the local project is actively developed in our group, we usually use it as a project dependency instead of a package. I am wondering if it is possible to submit a function along with the local project to a Dask cluster so that the project is only temporarily available to the function while the job is running. I tried the client.upload_file approach, programmatically building the local project into a wheel and uploading it before submitting a job. However, I got the error “zipimport.ZipImportError: bad local file header: ‘/tmp/dask-worker-space/worker-ic9bri0d/deepsea_core-2.0.5-py3.9.egg’” when the file is re-uploaded. It looks like the file persists on the Dask worker, which defeats the purpose of submitting jobs with dependencies on the fly.
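For reference, here is a minimal sketch of the kind of workflow I mean, assuming an in-process cluster for illustration. The module name `deepsea_stub` and its `answer()` function are placeholders standing in for the real local project, not the actual `deepsea_core` code:

```python
import os
import tempfile
from dask.distributed import Client

# In-process cluster for demonstration; in practice this would be a
# remote scheduler address.
client = Client(processes=False)

# Write a tiny stand-in for the local project (hypothetical module).
with tempfile.TemporaryDirectory() as tmp:
    mod_path = os.path.join(tmp, "deepsea_stub.py")
    with open(mod_path, "w") as f:
        f.write("def answer():\n    return 42\n")
    # Ship the file to every worker and add it to their import path.
    client.upload_file(mod_path)

def job():
    import deepsea_stub  # resolved from the uploaded file on the worker
    return deepsea_stub.answer()

result = client.submit(job).result()
print(result)
client.close()
```

The goal would be for the uploaded project to exist on the workers only for the duration of the job, and for a re-upload of a newer build to cleanly replace the old one rather than hitting the `zipimport` error above.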