Hello!
I am trying to run a pipeline on Azure Batch, but whenever it runs I get an error saying the task was killed by an external system. All of the setup steps run fine (i.e. the creation of the work directory, .command.run, .command.sh, etc.), but no further files are produced (neither .command.log nor .exitcode).
I believe this is due to a malformed output-file URL in the task created by Nextflow. Here's an example:
"outputFiles": [
{
"filePattern": ".exitcode",
"destination": {
"container": {
"path": "nextflow_work/79/2f613d86e51525e85889e3e8e6ec1f/.exitcode",
"containerUrl": "https://<censored>.blob.core.windows.net/<censored>?<censored>"
}
},
"uploadOptions": {
"uploadCondition": "TaskCompletion"
}
},
{
"filePattern": ".command.log",
"destination": {
"container": {
"path": "nextflow_work/79/2f613d86e51525e85889e3e8e6ec1f/.command.log",
"containerUrl": "https://<censored>.blob.core.windows.net/<censored>?<censored>"
}
},
"uploadOptions": {
"uploadCondition": "TaskCompletion"
}
}
],
When we look at the task's results in Azure, there is an error stating that the specified container does not exist. The .command.run and .command.sh files themselves appear to be fine.
My current configuration can create the work directories automatically, create and upload results to the storage container, and download input files from the storage container. However, once a node attempts to run a task, it cannot seem to use the blobs it created.
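
For context, here is a trimmed-down sketch of the azure scope in my nextflow.config; the account names, token, region, and pool settings below are placeholders, not my real values:

// Minimal sketch of an azure scope in nextflow.config.
// All names, keys, and settings here are placeholders.
azure {
    storage {
        accountName = 'mystorageaccount'     // placeholder storage account
        sasToken    = '<sas-token>'          // placeholder SAS token
    }
    batch {
        location          = 'eastus'         // placeholder region
        accountName       = 'mybatchaccount' // placeholder Batch account
        accountKey        = '<batch-key>'    // placeholder key
        autoPoolMode      = true             // let Nextflow manage pools
        allowPoolCreation = true
    }
}

// The work directory points at a container in the same storage account,
// e.g. the run is launched with:
//   nextflow run <pipeline> -w az://<container>/nextflow_work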
I unfortunately don’t have the .command.log to attach, but I can attach the .nextflow.log.
.nextflow.log (54.4 KB)