BQSR can't run!

Hello!
I'm running BQSR with GATK v4.0.4.0, but I've hit a problem and can't get the expected output files.
The command lines are:
$GATK --java-options "-Xmx10240m -Djava.io.tmpdir=./" BaseRecalibratorSpark -R $GENOME -I $sample-md_rl.bam --known-sites /home/gaotiangang/niuguohao/201806call/50100-step5-3/combined_1.raw_snp.vcf -O $sample.4.table --spark-master local[4]
$GATK --java-options "-Xmx10240m -Djava.io.tmpdir=./" ApplyBQSRSpark -I $sample-md_rl.bam -bqsr $sample.4.table -O $sample.4.bam --spark-master local[4]
echo "BQSR 1 over "

But I get these lines back almost every time:
18/07/25 08:13:23 ERROR Executor: Exception in task 9.0 in stage 1.0 (TID 191)
java.io.IOException: Failed to create local dir in /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao/blockmgr-accb99db-cd04-4a44-a018-0672408e3f03/1d.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.shuffle.IndexShuffleBlockResolver.getDataFile(IndexShuffleBlockResolver.scala:55)
at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.closeAndWriteOutput(UnsafeShuffleWriter.java:212)
at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:169)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
18/07/25 08:13:23 INFO TaskSetManager: Starting task 11.0 in stage 1.0 (TID 193, localhost, executor driver, partition 11, PROCESS_LOCAL, 4906 bytes)
18/07/25 08:13:23 INFO Executor: Running task 11.0 in stage 1.0 (TID 193)
18/07/25 08:13:23 WARN TaskSetManager: Lost task 9.0 in stage 1.0 (TID 191, localhost, executor driver): java.io.IOException: Failed to create local dir in /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao/blockmgr-accb99db-cd04-4a44-a018-0672408e3f03/1d.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.shuffle.IndexShuffleBlockResolver.getDataFile(IndexShuffleBlockResolver.scala:55)
at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.closeAndWriteOutput(UnsafeShuffleWriter.java:212)
at org.apache.spark.shuffle.sort.UnsafeShuffleWriter.write(UnsafeShuffleWriter.java:169)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
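
The path in the stack trace is Spark's block-manager scratch directory, so one thing I plan to check is whether its parent directory is actually writable and still has free space, e.g.:

# check free space, ownership/permissions, and writability of the directory Spark failed to write into
df -h /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao
ls -ld /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao
touch /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao/write_test && rm /home/gaotiangang/niuguohao/201806call/50-step5-7/niuguohao/write_test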

Does that mean I set something wrong in my command line?

Thanks!
