Spawning processes in Workflow

Hello,

I am trying to use Nextflow for a high-throughput computing task. Specifically, I would like to spawn a process Model and let the workflow continue. In the Model process, a model is served using a container, and we can query the model over HTTP. In another process, MCMC, I want to be able to query the Model process via HTTP and perform some computation. After MCMC has completed, I would like to kill the Model process and continue further with the workflow.

I have read through the documentation, the training material, and a lot of forums, but I could not find any way to spawn a process and control its termination based on the completion of another process. Is this possible using Nextflow? I'd be happy to learn how to achieve this use case.

Any ideas are welcome.

Thank you
~Alan

The main issue here is that communication between processes is supposed to be directional, and processes are supposed to be isolated, so you would need to set up a communication channel before starting Model and MCMC. This already causes problems: processes are asynchronous, so you can easily end up with a deadlock, for example when the receiver wants to communicate with a process that hasn't started yet. Then there's also the clean-up of the communication channel to deal with.

Here is a named pipe example, although this approach is not recommended:

workflow {
    MKFIFO()
    SENDER( params.message, MKFIFO.out.pipe )
    RECEIVER( MKFIFO.out.pipe ) | view
}

// Creates a named pipe (FIFO) and publishes it as an output path.
process MKFIFO {
    output:
    path "mypipe", emit: pipe

    script:
    """
    mkfifo mypipe
    """
}

// Writes the message into the pipe; blocks until a reader opens it.
process SENDER {
    input:
    val message
    path pipename

    output:
    path pipename

    script:
    """
    echo $message > $pipename
    """
}

// Reads from the pipe; blocks until a writer opens it.
process RECEIVER {
    input:
    path pipename

    output:
    stdout

    script:
    """
    cat $pipename
    """
}
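
To try the snippet, assuming it is saved as main.nf, something like this should work, since params.message is only ever set from the command line here:

nextflow run main.nf --message 'hello world'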

Thank you @mahesh.binzerpanchal!
I understand that Nextflow is not designed for orchestrating services and follows the dataflow paradigm. But this named pipe example will definitely get me started. :slight_smile:
I plan on adding some logic to the script of RECEIVER so that it does not start until the named pipe contains a specific message (e.g. model=1), and vice versa: adding some logic to SENDER so that it stops execution on receiving a specific message on the named pipe (e.g. mcmc=0).
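
For reference, here is a rough sketch of what that handshake could look like. It is only an illustration of the plan above, not a tested solution: serve_model is a placeholder command, and it assumes two FIFOs (ready_pipe and done_pipe, one per direction, created by an MKFIFO-style process) so that neither side can read back its own message.

process MODEL {
    input:
    path ready_pipe
    path done_pipe

    script:
    """
    # Start the model server in the background (serve_model is a placeholder;
    # ideally you would poll its HTTP endpoint before announcing readiness).
    serve_model &
    server_pid=\$!

    # Announce readiness; this blocks until the MCMC side opens the pipe.
    echo "model=1" > $ready_pipe

    # Block until the MCMC side reports it is done, then stop the server.
    read -r msg < $done_pipe
    [ "\$msg" = "mcmc=0" ] && kill \$server_pid
    """
}

process MCMC {
    input:
    path ready_pipe
    path done_pipe

    output:
    stdout

    script:
    """
    # Block until the model side announces it is ready.
    read -r msg < $ready_pipe
    [ "\$msg" = "model=1" ] || exit 1

    # ... query the model over HTTP and run the MCMC here ...

    # Tell the model side it can shut down.
    echo "mcmc=0" > $done_pipe
    """
}

Using one FIFO per direction avoids a process reading back its own sentinel message.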

Could you elaborate on what needs to be done to clean up the MKFIFO process after both SENDER and RECEIVER have completed?

Please find my reference solution here:

I’d be happy if you can comment on any obvious gotchas.

Well, the MKFIFO process needs to exit in order to send its output to the SENDER and RECEIVER processes, so by the time they run, that task instance has already completed. Normally one would delete the pipe afterwards, but you can’t go back into a completed task like that.
The only options are to run nextflow clean afterwards or to set cleanup = true in the config to clean up the dangling named pipes.
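
Concretely, the config route would be a single line in nextflow.config, which removes the work directories once the run completes successfully (taking the dangling FIFOs with them):

cleanup = true

The other route is to run this after the workflow has finished:

nextflow clean -f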

That makes total sense. Thank you for the tip :slight_smile:
