Hi everyone,
I was advised to ask here rather than in the nf-core Slack channel:
I am currently tailoring a pipeline to run on Azure Batch. For this I can authenticate to an Azure storage account with the nf-azure plugin, which lets the pipeline run with that storage account linked to a Batch account.
With that setup it is possible to read input from any storage container in this storage account, as well as to save output to any container in it.
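For context, the working setup looks roughly like the following `nextflow.config` sketch (the account names, keys, and the `az://` paths are placeholders, not our real values):

```groovy
// Minimal sketch of the nf-azure configuration that works today.
// 'mybatchstorage' / 'mybatch' are hypothetical account names.
process.executor = 'azurebatch'
workDir = 'az://work-container/work'   // container in the linked storage account

azure {
    storage {
        accountName = 'mybatchstorage'
        accountKey  = '<storage-account-key>'   // or sasToken = '...'
    }
    batch {
        location    = 'westeurope'
        accountName = 'mybatch'
        accountKey  = '<batch-account-key>'
    }
}
```

With this, `az://<container>/<path>` inputs and `publishDir` targets anywhere in `mybatchstorage` work; the question is how to reach containers in a *second* storage account that is not configured here.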
Meanwhile, our infrastructure is built in such a way that the data lives in a different storage account, which we have no intention of linking to a Batch account.
It is not clear to me what the best way is to access files from this second Azure storage account so they can be processed with the nf-azure Batch plugin.
I have thought about working around the problem by copying the relevant files to the first storage account, or perhaps mounting the needed storage container on the Nextflow head node with blobfuse.
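To make the two workarounds concrete, this is roughly what I have in mind (a sketch only; `datastorage`, the container names, mount point, and SAS tokens are placeholders, and both commands need valid Azure credentials to actually run):

```shell
# Option 1: stage the data across with azcopy, using SAS tokens on both sides,
# so the files end up in the storage account that is linked to Batch.
azcopy copy \
  'https://datastorage.blob.core.windows.net/raw-data/run42?<src-SAS>' \
  'https://mybatchstorage.blob.core.windows.net/staging/run42?<dst-SAS>' \
  --recursive

# Option 2: mount the second account's container on the Nextflow head node
# with blobfuse2, so the pipeline sees the data as local paths.
blobfuse2 mount /mnt/raw-data --config-file=./blobfuse2.yaml
```

My hesitation with option 1 is the extra copy of potentially large datasets; with option 2, that a head-node mount is only visible locally, not to the Batch compute nodes, so inputs would still have to be staged somehow.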
I would highly appreciate any experience/input/ideas on this, or a pointer to a better channel for this question.
Best Regards
Jakob