
Stream closed error with GATK 4.1.1.0

fazulur (Hyderabad), Member

Dear GATK team,

I am getting the error below when running the gatk-variant pipeline of bcbio; bcbio uses GATK version 4.1.1.0.
When I run ApplyBQSRSpark with GATK 4.0.9.0, it runs fine without any issues.

Here is the command:

gatk ApplyBQSRSpark --input test-sort.bam --output test-sort-recal.bam --bqsr-recal-file test-sort-recal.grp --static-quantized-quals 10 --static-quantized-quals 20 --static-quantized-quals 30 --spark-master local[8] --conf spark.local.dir=scratch/ --conf spark.driver.host=localhost --conf spark.network.timeout=800 --jdk-deflater --jdk-inflater

Here is the error:

[April 28, 2019 10:11:25 AM AST] org.broadinstitute.hellbender.tools.spark.ApplyBQSRSpark done. Elapsed time: 0.15 minutes.
Runtime.totalMemory()=874512384
htsjdk.samtools.util.RuntimeIOException: java.io.IOException: Stream closed
at htsjdk.samtools.IndexStreamBuffer.readFully(IndexStreamBuffer.java:23)
at htsjdk.samtools.IndexStreamBuffer.readLong(IndexStreamBuffer.java:62)
at htsjdk.samtools.AbstractBAMFileIndex.readLong(AbstractBAMFileIndex.java:436)
at htsjdk.samtools.AbstractBAMFileIndex.query(AbstractBAMFileIndex.java:311)
at htsjdk.samtools.CachingBAMFileIndex.getQueryResults(CachingBAMFileIndex.java:159)
at htsjdk.samtools.BAMIndexMerger.processIndex(BAMIndexMerger.java:43)
at htsjdk.samtools.BAMIndexMerger.processIndex(BAMIndexMerger.java:16)
at org.disq_bio.disq.impl.file.IndexFileMerger.mergeParts(IndexFileMerger.java:90)
at org.disq_bio.disq.impl.formats.bam.BamSink.save(BamSink.java:132)
at org.disq_bio.disq.HtsjdkReadsRddStorage.write(HtsjdkReadsRddStorage.java:225)
at org.broadinstitute.hellbender.engine.spark.datasources.ReadsSparkSink.writeReads(ReadsSparkSink.java:155)
at org.broadinstitute.hellbender.engine.spark.datasources.ReadsSparkSink.writeReads(ReadsSparkSink.java:120)
at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.writeReads(GATKSparkTool.java:361)
at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.writeReads(GATKSparkTool.java:349)
at org.broadinstitute.hellbender.tools.spark.ApplyBQSRSpark.runTool(ApplyBQSRSpark.java:90)
at org.broadinstitute.hellbender.engine.spark.GATKSparkTool.runPipeline(GATKSparkTool.java:528)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:30)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:138)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:162)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:205)
at org.broadinstitute.hellbender.Main.main(Main.java:291)
Caused by: java.io.IOException: Stream closed
at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:170)
at java.io.BufferedInputStream.read(BufferedInputStream.java:336)
at java.io.DataInputStream.read(DataInputStream.java:149)
at java.io.DataInputStream.read(DataInputStream.java:149)
at org.disq_bio.disq.impl.file.HadoopFileSystemWrapper$SeekableHadoopStream.read(HadoopFileSystemWrapper.java:232)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at htsjdk.samtools.seekablestream.SeekableBufferedStream.read(SeekableBufferedStream.java:133)
at htsjdk.samtools.IndexStreamBuffer.readFully(IndexStreamBuffer.java:21)
... 22 more
19/04/28 10:11:25 INFO ShutdownHookManager: Shutdown hook called
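
The stack trace shows the failure occurs while disq merges the per-part BAM index files (`BAMIndexMerger.processIndex`), not during the recalibration itself. As a sketch of a possible workaround (an assumption on my part, not a confirmed fix), disabling on-the-fly index creation should skip that merge step; `--create-output-bam-index` is a standard GATK output argument:

```shell
# Same ApplyBQSRSpark invocation, but with BAM index creation disabled so
# the disq index-merge step (where the "Stream closed" error is thrown)
# is never reached.
gatk ApplyBQSRSpark \
    --input test-sort.bam \
    --output test-sort-recal.bam \
    --bqsr-recal-file test-sort-recal.grp \
    --static-quantized-quals 10 \
    --static-quantized-quals 20 \
    --static-quantized-quals 30 \
    --spark-master 'local[8]' \
    --conf spark.local.dir=scratch/ \
    --conf spark.driver.host=localhost \
    --conf spark.network.timeout=800 \
    --jdk-deflater --jdk-inflater \
    --create-output-bam-index false

# If downstream tools need the index, it can be rebuilt separately
# once the recalibrated BAM has been written:
samtools index test-sort-recal.bam
```

This only sidesteps the index merge; whether the underlying disq/htsjdk bug is fixed in a later release is a separate question.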

Could you please help me resolve this issue?

Thanks in advance,
Fazulur Rehaman

Issue · GitHub
Filed by bhanuGandham
Issue number: 5919
State: open
