
Error when running Picard MarkDuplicates

I am running the MarkDuplicates step on our server, but it always fails with the same error, even after I switched to a different server.
This is my command:
java -jar /home/yangguoqian/biosoft/picard-tools-2.5.0/picard.jar MarkDuplicates \
    INPUT=/home/chenyunmei/hetero/moso_gatk/bam/Bamboo-PCRfree_Round68_Lane1.bam \
    OUTPUT=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.bam \
    METRICS_FILE=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.metrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    CREATE_INDEX=true \
    REMOVE_DUPLICATES=TRUE
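For context on the /tmp paths in the trace below: MarkDuplicates spills intermediate read-end data to temporary files, which by default land under the system temp directory. If /tmp is small or periodically cleaned, the spill files can be redirected with Picard's standard TMP_DIR option; a minimal variant of the command above, where /home/chenyunmei/tmp is only an example of a writable scratch directory (this relocates the files but does not by itself change the open-file limit discussed below):

# TMP_DIR is a standard Picard option; the scratch path here is a hypothetical example
mkdir -p /home/chenyunmei/tmp
java -jar /home/yangguoqian/biosoft/picard-tools-2.5.0/picard.jar MarkDuplicates \
    TMP_DIR=/home/chenyunmei/tmp \
    INPUT=/home/chenyunmei/hetero/moso_gatk/bam/Bamboo-PCRfree_Round68_Lane1.bam \
    OUTPUT=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.bam \
    METRICS_FILE=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.metrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    CREATE_INDEX=true \
    REMOVE_DUPLICATES=TRUE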

And the following is the error message:
READ_NAME_REGEX=<optimized capture of last three ':' separated fields as numeric values> OPTICAL_DUPLICATE_PIXEL_DISTANCE=100 VERBOSITY=INFO QUIET=false COMPRESSION_LEVEL=5 MAX_RECORDS_IN_RAM=500000 CREATE_MD5_FILE=false GA4GH_CLIENT_SECRETS=client_secrets.json
[Sat Jul 30 19:56:01 CST 2016] Executing as chenyunmei@localhost.localdomain on Linux 3.10.0-229.el7.x86_64 amd64; Java HotSpot(TM) 64-Bit Server VM 1.8.0_77-b03; Picard version: 2.5.0(2c370988aefe41f579920c8a6a678a201c5261c1_1466708365)
INFO 2016-07-30 19:56:01 MarkDuplicates Start of doWork freeMemory: 2045995552; totalMemory: 2058354688; maxMemory: 28631367680
INFO 2016-07-30 19:56:01 MarkDuplicates Reading input file and constructing read end information.
INFO 2016-07-30 19:56:01 MarkDuplicates Will retain up to 110120644 data points before spilling to disk.
[Sat Jul 30 19:56:12 CST 2016] picard.sam.markduplicates.MarkDuplicates done. Elapsed time: 0.18 minutes.
Runtime.totalMemory()=3755474944
To get help, see http://broadinstitute.github.io/picard/index.html#GettingHelp
Exception in thread "main" htsjdk.samtools.SAMException: /tmp/chenyunmei/CSPI.2016309087006492130.tmp/150462.tmpnot found
    at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:63)
    at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:49)
    at htsjdk.samtools.util.ResourceLimitedMap.get(ResourceLimitedMap.java:76)
    at htsjdk.samtools.CoordinateSortedPairInfoMap.getOutputStreamForSequence(CoordinateSortedPairInfoMap.java:180)
    at htsjdk.samtools.CoordinateSortedPairInfoMap.put(CoordinateSortedPairInfoMap.java:164)
    at picard.sam.markduplicates.util.DiskBasedReadEndsForMarkDuplicatesMap.put(DiskBasedReadEndsForMarkDuplicatesMap.java:65)
    at picard.sam.markduplicates.MarkDuplicates.buildSortedReadEndLists(MarkDuplicates.java:449)
    at picard.sam.markduplicates.MarkDuplicates.doWork(MarkDuplicates.java:193)
    at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:208)
    at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:95)
    at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:105)
Caused by: java.io.FileNotFoundException: /tmp/chenyunmei/CSPI.2016309087006492130.tmp/150462.tmp (Too many open files)
    at java.io.FileOutputStream.open0(Native Method)
    at java.io.FileOutputStream.open(FileOutputStream.java:270)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
    at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:60)
    ... 10 more
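From searching around, the underlying cause in the last "Caused by" line is "(Too many open files)", i.e. the process hit the operating system's per-process file descriptor limit, not a genuinely missing file. The getOutputStreamForSequence frame in the trace suggests MarkDuplicates keeps a spill file open per reference sequence, so an assembly with many scaffolds can exhaust a default limit such as 1024. A sketch of the two commonly suggested workarounds, assuming a bash shell (the numbers are illustrative, not tuned values):

# Show the current per-process open-file limit (often 1024 by default)
ulimit -n

# Workaround 1: raise the limit for this shell session, then rerun Picard
ulimit -n 8192

# Workaround 2: cap how many spill files MarkDuplicates keeps open at once.
# MAX_FILE_HANDLES_FOR_READ_ENDS_MAP is a standard MarkDuplicates option;
# the usual advice is a value somewhat below the `ulimit -n` output, e.g.
# append this to the MarkDuplicates command above:
#     MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000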

Please help me. I have wasted a lot of time on this, but I still don't understand the root cause.
