Processes fail silently (and mysteriously) because of a bash variable assignment

I’m trying to run an old transcriptome assembly and QC pipeline (TransPi), which is somewhat outdated by now. The pipeline is written in DSL1, so I’m using Nextflow v22.11.1-edge (long story, but that’s just about the only version that supports both DSL1 and Apptainer).
I ran the pipeline before on an older HPC that used Singularity with Nextflow v21.04.1, but since moving to this newer version (and to an HPC that uses Slurm and Apptainer), a lot of the jobs have been failing silently. After a lot of digging and debugging I tracked it down to the offending piece of code (used to record the tools’ versions) at the end of the script:

v=\$( blastn -version | head -n1 | awk '{print \$2}' ) 
echo "Blast: \$v" >>evigene.version.txt 
v=\$( cd-hit -h | head -n1 | cut -f 1 -d "(" | cut -f 2 -d "n" ) 
echo "CD-HIT: \$v" >>evigene.version.txt 
v=\$( exonerate -v | head -n1 | cut -f 5 -d " " ) 
echo "Exonerate: \$v" >>evigene.version.txt

For some reason I can’t pin down, assigning v (or referencing it on the next line) makes the process fail silently with exit code 1 and no output. If I inline the command substitution into the echo command instead of assigning v first (like below), the error disappears and the process completes successfully.

echo "Exonerate: \$( exonerate -v | head -n1 | cut -f 5 -d " " )" >>evigene.version.txt
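In case a standalone repro helps: my current guess is that this comes down to how errexit treats the two forms. Nextflow’s documented default shell for the generated .command.sh is ['/bin/bash', '-ue'], i.e. errexit on, and in bash the exit status of a bare assignment `v=$( cmd )` is the status of the command substitution, while `echo "... $( cmd )"` reports echo’s own status (0) regardless. Here `false` is a stand-in (an assumption on my part) for a tool that prints its version/help text but exits non-zero, as some tools do for -h / -v:

```shell
#!/bin/bash
# Minimal sketch, outside Nextflow, assuming errexit is on in the
# generated script ('/bin/bash -ue' is Nextflow's documented default).
# 'false' stands in for a tool that exits non-zero after printing.

bash -ec '
  # echo is the command here, so errexit sees echo'\''s status (0);
  # the non-zero status from inside the substitution is discarded
  echo "inline: $( false )"
  echo "after inline: still alive"

  # a bare assignment has no command of its own, so its exit status
  # is the substitution'\''s; errexit kills the shell right here
  v=$( false )
  echo "after assignment: never reached"
'
echo "inner shell exit status: $?"
```

One wrinkle I’m unsure about: the snippets above end in head/cut pipelines, whose status is the last command’s unless pipefail is also set, so this only fully explains it if the run also has `set -o pipefail` or the first command in the pipeline is what fails in the new containers.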

Any ideas from the gurus here?