Read & Write to 2 different S3 buckets

Hi, I have a collaboration where the data sits in an S3 bucket in the collaborator's account. They have provided a set of read-only keys for us to mount the bucket. I want to run my analysis on this data and write the results back to our own bucket with a different set of keys, but we cannot find a way to set up two sets of S3 keys: there only seems to be a way to store one set. The added difficulty is that we're not using AWS S3 natively but via NetApp StorageGRID, so AWS IAM won't work for us.
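For context, the one set of credentials we can store (plus the StorageGRID endpoint) goes into the `aws` scope of `nextflow.config`. A minimal sketch of what we have today, with placeholder key values and an assumed endpoint URL:

```groovy
// nextflow.config -- single credential store (placeholder values)
aws {
    accessKey = 'COLLABORATOR_READ_ONLY_KEY'      // read-only key supplied by the collaborator
    secretKey = 'COLLABORATOR_READ_ONLY_SECRET'
    client {
        endpoint          = 'https://storagegrid.example.org'  // NetApp StorageGRID S3-compatible endpoint (assumed URL)
        s3PathStyleAccess = true                                // path-style access is typically needed for non-AWS S3 endpoints
    }
}
```

There is nowhere in this scope to add a second access/secret key pair for a different bucket.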

Hi mwe-bixeng,

Can you describe a little about where the Nextflow processes are being run? Are you running locally and writing to remote buckets, or are you running Nextflow on an AWS instance where it inherits the permissions of the instance?

Yes, sure. We run Nextflow in a few different places: on our national compute infrastructure, which is HPC-based, and also on AWS. We would like to read from and write to remote buckets from an AWS instance. This can already be done; however, currently the bucket has to be the same for both read and write, since only one set of keys can be stored. Most of the time the read buckets belong to some other group (a data share), so the writes need to go somewhere else, which we couldn't configure because of the single key store.
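To make the constraint concrete, here is a sketch of the intended data flow (bucket names and the analysis script are hypothetical). The workflow stages inputs from the collaborator's bucket and publishes results to our own bucket, but both transfers go through the same single `aws` credential scope shown above:

```groovy
// main.nf -- hypothetical bucket names, illustrating the read/write split
params.input  = 's3://collaborator-bucket/data/*.fastq.gz'   // readable only with their read-only keys
params.outdir = 's3://our-results-bucket/analysis'           // writable only with our own keys

process ANALYSE {
    publishDir params.outdir, mode: 'copy'   // write path -- would need our credentials

    input:
    path reads

    output:
    path 'results.txt'

    script:
    """
    run_analysis.sh ${reads} > results.txt
    """
}

workflow {
    ANALYSE( Channel.fromPath(params.input) )   // read path -- would need the collaborator's credentials
}
```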