If I understood correctly, `/bin/grandsonofvictor.py` is not inside the container image but on the host operating system. It should live in a folder named `bin` inside the project folder of the Nextflow pipeline, yet you seem to be providing it from the root (`/`) as `/bin/grandsonofvictor.py`. Try moving it to `./bin/grandsonofvictor.py` and giving it execute permission (`chmod +x ./bin/grandsonofvictor.py`). By doing these two things, Nextflow will automatically make sure `grandsonofvictor.py` is accessible from within the container. You don't need the `./bin/` prefix when calling it in the process block.
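For reference, a minimal sketch of what that looks like. The process name and input are hypothetical, only the script name comes from your post, and the script itself needs a shebang line (e.g. `#!/usr/bin/env python3`) to be runnable this way:

```nextflow
// main.nf — assumes grandsonofvictor.py sits at <project dir>/bin/
// and has been made executable with: chmod +x bin/grandsonofvictor.py
process runScript {
    input:
    path infile

    script:
    """
    grandsonofvictor.py $infile   # no ./bin/ prefix; Nextflow puts bin/ on the PATH
    """
}
```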
Ideally, you just let Nextflow take care of staging/moving files around. You shouldn’t have to worry about this.
I recommend you have a look at this section of the foundational training. It will help you understand what I meant in my previous post. Let me know if it's still not solved after that.
Thanks for sharing the link. I combed through it before posting on the forum, and it doesn't solve the issue I'm facing. Instead of specifying the container on the command line, I declare the Docker image inside each process of interest, since there are multiple processes using different Docker images.
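Roughly like this, with the `container` directive per process plus `docker.enabled = true` in `nextflow.config` (the process and image names here are placeholders, not my real ones):

```nextflow
process alignReads {
    container 'quay.io/someorg/tool-a:1.0'   // placeholder image

    script:
    """
    tool-a --help
    """
}

process callVariants {
    container 'quay.io/someorg/tool-b:2.3'   // a different image for this process

    script:
    """
    tool-b --help
    """
}
```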
Sorry if my post wasn't clear: the Docker image has a `bin` folder, and the script is in there as `sonofwhatever.py`.
How do I check that Nextflow is inside, or has actually started, the Docker container? Is there a way to list (`ls`) the contents there?
Finally, how do I make that script in the `bin` folder available at `docker run` time?
If you're sure the container image has a script named `sonofwhatever.py` at `/bin/`, and you have configured Nextflow properly to use Docker with this container image for a specific process, I don't see a reason for running into a `command not found` error. Can you share the container image so that I could try locally?
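In the meantime, you can verify yourself that the script is where you expect by inspecting the image with plain Docker (the image name below is a placeholder, substitute your own):

```shell
# List the contents of /bin inside the image
docker run --rm your-image:tag ls -l /bin

# Or open an interactive shell to poke around
docker run --rm -it your-image:tag /bin/sh
```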
You can check the contents of the .command.run file within the task folder [of the process you’re interested in]. There should be docker command lines in it, such as in the nxf_launch function below: