


Error when running MarkDuplicates

cymcym (China), Member, Posts: 4

I ran the MarkDuplicates step on the server, but it always prints an error message, even though I switched to a different server.
This is my input:
java -jar /home/yangguoqian/biosoft/picard-tools-2.5.0/picard.jar MarkDuplicates \
    INPUT=/home/chenyunmei/hetero/moso_gatk/bam/Bamboo-PCRfree_Round68_Lane1.bam \
    OUTPUT=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.bam \
    METRICS_FILE=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.metrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    CREATE_INDEX=true \
    REMOVE_DUPLICATES=TRUE

And the following is the error message:
optimized capture of last three ':' separated fields as numeric values> OPTICAL_DUPLICATE_PIXEL_DISTANCE=100 VERBOSITY=INFO QUIET=false COMPRESSION_LEVEL=5 MAX_RECORDS_IN_RAM=500000 CREATE_MD5_FILE=false GA4GH_CLIENT_SECRETS=client_secrets.json
[Sat Jul 30 19:56:01 CST 2016] Executing as chenyunmei@localhost.localdomain on Linux 3.10.0-229.el7.x86_64 amd64; Java HotSpot(TM) 64-Bit Server VM 1.8.0_77-b03; Picard version: 2.5.0(2c370988aefe41f579920c8a6a678a201c5261c1_1466708365)
INFO 2016-07-30 19:56:01 MarkDuplicates Start of doWork freeMemory: 2045995552; totalMemory: 2058354688; maxMemory: 28631367680
INFO 2016-07-30 19:56:01 MarkDuplicates Reading input file and constructing read end information.
INFO 2016-07-30 19:56:01 MarkDuplicates Will retain up to 110120644 data points before spilling to disk.
[Sat Jul 30 19:56:12 CST 2016] picard.sam.markduplicates.MarkDuplicates done. Elapsed time: 0.18 minutes.
Runtime.totalMemory()=3755474944
To get help, see http://broadinstitute.github.io/picard/index.html#GettingHelp
Exception in thread "main" htsjdk.samtools.SAMException: /tmp/chenyunmei/CSPI.2016309087006492130.tmp/150462.tmpnot found
at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:63)
at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:49)
at htsjdk.samtools.util.ResourceLimitedMap.get(ResourceLimitedMap.java:76)
at htsjdk.samtools.CoordinateSortedPairInfoMap.getOutputStreamForSequence(CoordinateSortedPairInfoMap.java:180)
at htsjdk.samtools.CoordinateSortedPairInfoMap.put(CoordinateSortedPairInfoMap.java:164)
at picard.sam.markduplicates.util.DiskBasedReadEndsForMarkDuplicatesMap.put(DiskBasedReadEndsForMarkDuplicatesMap.java:65)
at picard.sam.markduplicates.MarkDuplicates.buildSortedReadEndLists(MarkDuplicates.java:449)
at picard.sam.markduplicates.MarkDuplicates.doWork(MarkDuplicates.java:193)
at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:208)
at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:95)
at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:105)
Caused by: java.io.FileNotFoundException: /tmp/chenyunmei/CSPI.2016309087006492130.tmp/150462.tmp (Too many open files)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at htsjdk.samtools.util.FileAppendStreamLRUCache$Functor.makeValue(FileAppendStreamLRUCache.java:60)
... 10 more
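
The final Caused by line is the actual failure: "Too many open files". The trace shows MarkDuplicates spilling its coordinate-sorted read-end map into many small temporary files under /tmp, so the process appears to have hit the per-process open-file limit. Below is a minimal sketch of how one might check that limit and rerun with fewer concurrently open spill files. MAX_FILE_HANDLES_FOR_READ_ENDS_MAP is an existing Picard MarkDuplicates option; the limit value 4096 and the cap of 1000 are illustrative assumptions, not values taken from this post.

# Check the per-process open-file limit in the shell that will run Picard.
ulimit -n

# If the reported limit is low (e.g. 1024), raise it for this session if the system allows it.
# The value 4096 is an illustrative assumption.
ulimit -n 4096

# Rerun MarkDuplicates, capping how many spill-file handles it keeps open at once.
# MAX_FILE_HANDLES_FOR_READ_ENDS_MAP should stay below the limit reported by ulimit -n.
java -jar /home/yangguoqian/biosoft/picard-tools-2.5.0/picard.jar MarkDuplicates \
    INPUT=/home/chenyunmei/hetero/moso_gatk/bam/Bamboo-PCRfree_Round68_Lane1.bam \
    OUTPUT=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.bam \
    METRICS_FILE=/home/chenyunmei/hetero/moso_gatk/markDup/Bamboo-PCRfree_Round68_Lane1.dedupped.metrics.txt \
    VALIDATION_STRINGENCY=LENIENT \
    CREATE_INDEX=true \
    REMOVE_DUPLICATES=TRUE \
    MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000

Lowering MAX_FILE_HANDLES_FOR_READ_ENDS_MAP trades some speed for staying under the handle budget; raising ulimit -n avoids that trade-off when the administrator permits it.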

Please help me. I have wasted a lot of time on this, but I really don't know what the reason is.

Answers
