
Temporary files

I'm running MuTect as follows:

java -Xmx16g -Djava.io.tmpdir=/scratch/MuTect/23/ -jar muTect-1.1.4.jar \
    --analysis_type MuTect \
    --dbsnp dbsnp_137.hg19.vcf \
    --cosmic hg19_cosmic_v54_120711.vcf \
    --input_file:normal Normal.bam \
    --input_file:tumor Tumor.bam \
    --reference_sequence ucsc.hg19.fasta \
    --out out/23.txt \
    --vcf out/23.vcf \
    --enable_extended_output \
    -nt 14 > out/logs/23.log

I'm having trouble with this command freezing after about 7 hours of running time. According to 'top' the threads are still running, but no more output is produced (including in the log files). While trying to identify the cause (and to rule out something in my local environment), I noticed a huge number of temporary files:

> find /scratch/MuTect/23/ -type f -name "org*.tmp" | wc -l
348444
> du -h /scratch/MuTect/23/
2.7G    /scratch/MuTect/23/

That seems like an enormous number of files for a single pair of samples! The command does not always freeze; other samples have processed successfully.
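
In case it helps anyone else debug the same symptom, here is a minimal shell sketch to check whether the tmp-file count keeps growing while the job appears frozen. The paths match my run above; the log filename 23.tmpwatch.log is just illustrative:

# Poll the scratch dir once a minute and record the tmp-file count
# and disk usage, so a freeze can be correlated with runaway
# temporary files. Ctrl-C to stop.
while true; do
    count=$(find /scratch/MuTect/23/ -type f -name "org*.tmp" | wc -l)
    usage=$(du -sh /scratch/MuTect/23/ | cut -f1)
    echo "$(date +%H:%M:%S) tmp_files=$count usage=$usage"
    sleep 60
done >> out/logs/23.tmpwatch.log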

Here's the end of the log:

> tail out/logs/23.log 
INFO  18:18:45,701 MuTect - [MUTECTOR] Processed -853994073 reads in 681 ms 
INFO  18:18:46,335 MuTect - [MUTECTOR] Processed -852994071 reads in 634 ms 
INFO  18:18:46,967 MuTect - [MUTECTOR] Processed -851993833 reads in 632 ms 
INFO  18:18:47,603 MuTect - [MUTECTOR] Processed -850993125 reads in 635 ms 
INFO  18:18:48,233 MuTect - [MUTECTOR] Inspected 4345000 potential candidates 
INFO  18:18:48,249 MuTect - [MUTECTOR] Processed -849992893 reads in 647 ms 
INFO  18:18:49,233 MuTect - [MUTECTOR] Processed -847992889 reads in 0 ms 
INFO  18:18:49,233 MuTect - [MUTECTOR] Processed -848992891 reads in 984 ms 
INFO  18:18:50,742 MuTect - [MUTECTOR] Processed -846992738 reads in 1509 ms 
INFO  18:19:10,796 ProgressMeter -   chrY:52600941        3.11e+09    8.0 h        9.3 s     98.5%         8.1 h     7.5 m 

Answers

  • The "-nt" flag was the likely culprit. I'll look into using Queue, but my runs don't take that long on a single processor anyway. Thanks @kcibul! A sketch of the single-threaded alternative follows below.
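
For reference, one way to parallelize without -nt is to scatter the run across intervals yourself: launch one single-threaded MuTect job per chromosome using the GATK-engine -L interval flag, then combine the per-chromosome outputs afterwards. This is a minimal sketch under those assumptions; the chromosome list, heap size, and output paths are illustrative and should be adjusted to your reference and memory budget:

# One single-threaded MuTect job per chromosome (no -nt).
# -L restricts each job to a single interval; jobs run in
# parallel as separate JVMs, so size -Xmx and concurrency to
# fit your machine's memory.
for chrom in chr{1..22} chrX chrY; do
    java -Xmx4g -Djava.io.tmpdir=/scratch/MuTect/23/ -jar muTect-1.1.4.jar \
        --analysis_type MuTect \
        --dbsnp dbsnp_137.hg19.vcf \
        --cosmic hg19_cosmic_v54_120711.vcf \
        --input_file:normal Normal.bam \
        --input_file:tumor Tumor.bam \
        --reference_sequence ucsc.hg19.fasta \
        -L "$chrom" \
        --out "out/23.$chrom.txt" \
        --vcf "out/23.$chrom.vcf" \
        --enable_extended_output > "out/logs/23.$chrom.log" 2>&1 &
done
wait    # block until every per-chromosome job has finished
# Per-chromosome call stats and VCFs can then be merged downstream.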
