limit number of files during large workflow
I'm trying to run a joint genotyping workflow based on the example production workflows on GitHub. I'm genotyping about 2,500 exomes across the whole genome.
During genotyping I started noticing the workflow stalling. After consulting with the HPC admins, it turns out this happens because the filesystem's maximum number of inodes is reached during the workflow.
My questions now are:
- How can I make Cromwell pick up job failures when GATK fails with a "No space left on device" error?
- Are there any config params I can tweak to limit the number of files generated/left over during the workflow? Is it possible, for example, to clear or tar the working directory after a job finishes successfully, keeping only the outputs?
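For the second question, one place to look is Cromwell's workflow options file (passed with `-o options.json`). A minimal sketch of the options that seem relevant here, assuming a hypothetical output path; note that `delete_intermediate_output_files` is documented as experimental and may not be supported on every backend, so please check the Cromwell documentation for your version and backend before relying on it:

```json
{
  "final_workflow_outputs_dir": "/path/to/final/outputs",
  "use_relative_output_paths": true,
  "delete_intermediate_output_files": true
}
```

With `final_workflow_outputs_dir` set, Cromwell copies the declared workflow outputs out of the execution directory when the workflow succeeds, which at least makes it safe to clean up the execution tree afterwards even if automatic deletion of intermediates isn't available on your backend.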