Solving GC Overhead Limit Exceeded Error

Hi everyone, I’m getting a “GC overhead limit exceeded” error from Java when running a single task in Nextflow. I’d like to understand how I can fix this; I’ve seen it could be related to Java memory allocation or heap size, but I’m not sure. I’ve tried enabling dynamic resource allocation directives in the processes that needed them, but the problem persists. Thanks all.

Could you please try increasing cpus and memory for that particular process? Make sure it’s really getting the memory you requested (depending on the executor, this is easy to verify).
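As a minimal sketch, a per-process resource request might look like this (the process name and values are illustrative; adjust them to your pipeline):

```groovy
process FASTQC {
    cpus 4
    memory '8 GB'

    input:
    path reads

    script:
    """
    fastqc --threads ${task.cpus} ${reads}
    """
}
```

On a cluster executor such as SLURM, you can confirm the request was actually passed through by inspecting the `.command.run` file in the task’s work directory, which contains the generated `#SBATCH` headers.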

Yes, I understand, but I really don’t think that’s it.
I tried enabling dynamic memory allocation so that each task attempt requests more memory than the last.
I could also set the cpus directive if needed.
As I could see with the htop command, the error includes the message ‘java.lang.OutOfMemoryError’, and then my workflow stops itself. In the FASTQC process, for example, it fails with this message when I handle several large samples at once. It’s as if Nextflow can’t manage this collection of inputs as a queue waiting to be executed by the process. In other cases the ‘java.lang.OutOfMemoryError’ message still appears, but the analyses complete anyway (perhaps only when a small number of samples run in parallel).
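If the failure only shows up when many large samples run at once, one option (a sketch under assumed values; the process name and numbers are illustrative) is to cap concurrency with `maxForks` while growing the memory request on each retry:

```groovy
process FASTQC {
    maxForks 4                      // run at most 4 sample tasks concurrently
    errorStrategy 'retry'           // resubmit the task if it fails
    maxRetries 2
    memory { 4.GB * task.attempt }  // 4 GB on attempt 1, 8 GB on attempt 2, 12 GB on attempt 3

    input:
    path reads

    script:
    """
    fastqc --threads ${task.cpus} ${reads}
    """
}
```

Note that the ‘java.lang.OutOfMemoryError’ here is raised by the JVM inside the task (FastQC is itself a Java tool), so the task-level memory request is what matters, not the memory given to the Nextflow launcher.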

Thank you