Greetings all,
I am trying to run my Nextflow pipeline on AWS Batch, but it crashes before the first process starts due to S3 access issues. I am launching it from my local workstation with the following command:
nextflow run tutorial4.nf -profile awsbatch -bucket-dir s3://bucket-name/tests/
The error message is below:
Caused by:
  Essential container in task exited

Command executed:

Command exit status:
  1

Command output:
  (empty)

Command error:
  download failed: s3://bucket-name/tests/9d/41b35e73e0d4223d29a37d7c582113/.command.run to - An error occurred (403) when calling the HeadObject operation: Forbidden
  upload failed: ./.command.log to s3://bucket-name/tests/9d/41b35e73e0d4223d29a37d7c582113/.command.log An error occurred (AccessDenied) when calling the PutObject operation: Access Denied

Work dir:
  s3://bucket-name/tests/9d/41b35e73e0d4223d29a37d7c582113
My profile configuration looks like this:
awsbatch {
    aws.accessKey    = '##########'
    aws.secretKey    = '##########'
    aws.region       = '##########'
    process.executor = 'awsbatch'
    process.queue    = '##########'
}
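For completeness, the full profile is shaped roughly like this (all real values redacted, placeholders are illustrative only). The aws.batch.cliPath line is something I have seen suggested for custom AMIs where the AWS CLI is installed outside the default PATH; I am not sure it applies in my case:

```groovy
// nextflow.config -- placeholder values, not my real credentials
profiles {
    awsbatch {
        aws.accessKey     = 'AKIA...'                 // redacted
        aws.secretKey     = '...'                     // redacted
        aws.region        = 'us-east-1'               // example region
        // Assumption: only needed if the AWS CLI on the compute AMI
        // lives somewhere non-standard, e.g. a conda install:
        aws.batch.cliPath = '/home/ec2-user/miniconda/bin/aws'
        process.executor  = 'awsbatch'
        process.queue     = 'my-batch-queue'          // example queue name
    }
}
```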
After receiving this error, I added a bucket policy granting my root/IAM user full access to the bucket, but the error is unchanged.
I am also confused about why Nextflow can create the task directory in the bucket, complete with the .command.run and .command.sh files, yet the job itself cannot download or upload files.
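For reference, the bucket policy I created is essentially an allow-everything statement along these lines (account ID, user name, and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowFullS3AccessForMyUser",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:user/my-user" },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::bucket-name",
        "arn:aws:s3:::bucket-name/*"
      ]
    }
  ]
}
```

My (possibly wrong) understanding is that this only covers the IAM user whose keys are in nextflow.config, which would explain why the staging done by the local client works. The containers running on Batch presumably authenticate through the ECS instance role of the compute environment instead, and this policy does not grant that role anything.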