Getting error "CreateBucket operation: Access Denied" while writing Dask DataFrame to S3 with df.to_csv(s3_file_path)


Issue: getting the error "CreateBucket operation: Access Denied" while writing a Dask DataFrame to S3 with df.to_csv(s3_file_path)

Description:

ddf.to_csv(preprocessed_file_loc, index=False)

I am trying to write a Dask DataFrame to CSV at a location that already exists on S3.
The location looks like this: s3://{bucketname}/poc/customers/preprocessed/dummy_sample-*.csv
But I am getting an error.
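
For context, here is roughly how the call is put together. The bucket name and the sample data below are placeholders; the real DataFrame comes from an earlier preprocessing step:

import pandas as pd
import dask.dataframe as dd

# Placeholder bucket and data; the real ddf is produced earlier in the flow.
preprocessed_file_loc = "s3://my-bucket/poc/customers/preprocessed/dummy_sample-*.csv"
ddf = dd.from_pandas(pd.DataFrame({"customer_id": [1, 2, 3]}), npartitions=1)

# This is the call that fails with the AccessDenied error.
ddf.to_csv(preprocessed_file_loc, index=False)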

Error trace:

08:37:28.422 | ERROR | Flow run 'rapid-hyrax' - Finished in state Failed('Flow run encountered an exception.')
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/s3fs/core.py", line 113, in _error_wrapper
    return await func(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/aiobotocore/client.py", line 371, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the CreateBucket operation: Access Denied

The above exception was the direct cause of the following exception:

PermissionError: Access Denied

I have tried everything, but it is not working, and I could not find any solution on the internet. I am not sure what I am missing, as this should be a trivial task.

Hi @archit1012, welcome to the Dask community!

How are you authenticating to your S3 server? Did you have a look at the storage_options you can use to configure your credentials and endpoint?

https://docs.dask.org/en/stable/how-to/connect-to-remote-data.html#amazon-s3
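
For example, you can pass explicit credentials (and, for a non-AWS server, a custom endpoint) through storage_options. This is only a sketch, assuming ddf and preprocessed_file_loc are the objects from your code above; the key names are the standard s3fs ones and the values are placeholders to replace with your own:

# Placeholder credentials and endpoint -- replace with your own values.
storage_options = {
    "key": "YOUR_ACCESS_KEY_ID",
    "secret": "YOUR_SECRET_ACCESS_KEY",
    # Only needed when the S3 server is not AWS (e.g. MinIO or another custom endpoint):
    # "client_kwargs": {"endpoint_url": "https://my-s3-endpoint.example.com"},
}

# ddf and preprocessed_file_loc are the DataFrame and path from your code above.
ddf.to_csv(
    preprocessed_file_loc,
    index=False,
    storage_options=storage_options,
)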