Delaunay Calculation for Large Point Cloud

Hi experts,

We need to calculate the Delaunay triangle mesh for a large 3D point cloud that has about 1 million points. The method we use is SciPy's Delaunay function. Once the point count reaches 40k, the performance is already too poor to be acceptable. We wonder if Dask can help in this situation?
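For reference, a minimal sketch of what we are doing today (the random points below just stand in for the real scanner data, which is much larger):

```python
import numpy as np
from scipy.spatial import Delaunay

# Placeholder for the real scanner data: an (N, 3) array of XYZ coordinates.
# Our real cloud has ~1M points; it already gets too slow for us around 40k.
points = np.random.rand(10_000, 3)

# Single global triangulation -- this is the call whose runtime blows up.
tri = Delaunay(points)
print(tri.simplices.shape)  # (n_simplices, 4): tetrahedra for 3D input
```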

Your advice is appreciated.
Regards
DR, Ling

Hi @DRLing2021,

I’ve got some colleagues who have tried to scale the Delaunay algorithm, but from what I understand, it can’t really be distributed at its core. I’ve been told that INRIA here in France built a distributed version of the Delaunay algorithm and was planning to contribute it to the GDAL library, but I haven’t been able to find sources in a few minutes of searching.

Maybe you could share a code snippet so we can understand what you are trying to achieve and how Dask could help?

Thanks @guillaumeeb. Sorry for the late reply; it has been a crazy time recently. The key challenge we are facing is performance. We are working with laser scanner data, similar to lidar data, which is huge and unstructured: a point cloud of coordinates. Building a mesh from the point cloud is a key step for many downstream operations, and the sheer number of points is a headache.

Where should one look to understand the Delaunay algorithm and to see whether it can be parallelized somehow? It seems some people have already tried and had a hard time.
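For what it's worth, the kind of naive approach I had in mind was to split the cloud into spatial tiles and triangulate each tile independently with `dask.delayed`. This is only a sketch (untested on our data), and it is not a true global Delaunay mesh, since tetrahedra crossing tile boundaries are simply missing and would need some stitching step:

```python
import numpy as np
import dask
from scipy.spatial import Delaunay

# Placeholder for the real (N, 3) scanner point cloud (~1M points in practice).
points = np.random.rand(200_000, 3)

def triangulate_tile(tile_points):
    # Triangulate one spatial tile independently of the others.
    return Delaunay(tile_points).simplices

# Crude partition: sort by x and cut into contiguous slabs.
# A real pipeline would more likely use a 2D grid or octree of tiles.
n_tiles = 8
order = np.argsort(points[:, 0])
tiles = np.array_split(points[order], n_tiles)

# Triangulate the tiles in parallel with dask.delayed.
# Depending on whether Qhull releases the GIL, a process-based or
# distributed scheduler may parallelize better than the default threads.
tasks = [dask.delayed(triangulate_tile)(t) for t in tiles]
simplices_per_tile = dask.compute(*tasks)
```

Would this kind of tiling be a reasonable direction, or is the boundary stitching the part that makes a proper distributed Delaunay so hard?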