
Issues with CreateReadCountPanelOfNormals

jml96 (Cambridge, Member)

Hi,
I am trying to implement the GATK4 copy-number-variant workflow, but I get the error below at the CreateReadCountPanelOfNormals step.
In bwa mem I used the option -R "@RG\tID:H_SL-DCIS64-A1_NODE\tLB:H_SL-DCIS64-A1_NODE\tPL:illumina\tPU:H_SL-DCIS64-A1_NODE\tSM:H_SL-DCIS64-A1_NODE" to add the read group to the BAM file header. I also tried Picard AddOrReplaceReadGroups, but I get the same error.
In the input BAM file the read group is defined in the header as "@RG ID:H_SL-DCIS64-A1_NODE LB:H_SL-DCIS64-A1_NODE PL:illumina PU:H_SL-DCIS64-A1_NODE SM:H_SL-DCIS64-A1_NODE".
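As a sanity check on the read-group line itself, a small script like the one below (a hypothetical helper, not part of GATK or Picard) can parse the @RG header line per the SAM specification and confirm that the tags the tool relies on (ID, and SM for the sample name) are actually present:

```python
# Hypothetical sanity check for a SAM @RG header line.
# Splits the tab-separated tags and verifies the fields the GATK
# copy-number tools need (ID and SM) are present.

def parse_read_group(rg_line: str) -> dict:
    """Parse an '@RG\\tID:...\\tSM:...' header line into a tag -> value dict."""
    fields = rg_line.rstrip("\n").split("\t")
    assert fields[0] == "@RG", "not a read-group header line"
    return dict(f.split(":", 1) for f in fields[1:])

rg = parse_read_group(
    "@RG\tID:H_SL-DCIS64-A1_NODE\tLB:H_SL-DCIS64-A1_NODE"
    "\tPL:illumina\tPU:H_SL-DCIS64-A1_NODE\tSM:H_SL-DCIS64-A1_NODE"
)
for tag in ("ID", "SM"):  # SM is the tag used to derive the sample name
    assert tag in rg, f"missing required tag {tag}"
print(rg["SM"])  # H_SL-DCIS64-A1_NODE
```

If this passes on the line copied from `samtools view -H`, the header text itself is well-formed, and the problem is more likely in which file is being passed to the tool.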
Do you know why I am getting this issue?
Thank you.

Regards,
João

12:00:06.706 WARN SparkContextFactory - Environment variables HELLBENDER_TEST_PROJECT and HELLBENDER_JSON_SERVICE_ACCOUNT_KEY must be set or the GCS hadoop connector will not be configured properly
12:00:06.758 INFO NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/rds-d3/user/jml96/hpc-work/Software/gatk-4.1.2.0/gatk-package-4.1.2.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
May 14, 2019 12:00:06 PM shaded.cloud_nio.com.google.auth.oauth2.ComputeEngineCredentials runningOnComputeEngine
INFO: Failed to detect whether we are running on Google Compute Engine.
12:00:06.999 INFO CreateReadCountPanelOfNormals - ------------------------------------------------------------
12:00:07.000 INFO CreateReadCountPanelOfNormals - The Genome Analysis Toolkit (GATK) v4.1.2.0
12:00:07.000 INFO CreateReadCountPanelOfNormals - For support and documentation go to https://software.broadinstitute.org/gatk/
12:00:07.001 INFO CreateReadCountPanelOfNormals - Executing as [email protected] on Linux v3.10.0-957.10.1.el7.x86_64 amd64
12:00:07.001 INFO CreateReadCountPanelOfNormals - Java runtime: OpenJDK 64-Bit Server VM v1.8.0_152-release-1056-b12
12:00:07.001 INFO CreateReadCountPanelOfNormals - Start Date/Time: May 14, 2019 12:00:06 PM BST
12:00:07.001 INFO CreateReadCountPanelOfNormals - ------------------------------------------------------------
12:00:07.001 INFO CreateReadCountPanelOfNormals - ------------------------------------------------------------
12:00:07.002 INFO CreateReadCountPanelOfNormals - HTSJDK Version: 2.19.0
12:00:07.002 INFO CreateReadCountPanelOfNormals - Picard Version: 2.19.0
12:00:07.002 INFO CreateReadCountPanelOfNormals - HTSJDK Defaults.COMPRESSION_LEVEL : 2
12:00:07.002 INFO CreateReadCountPanelOfNormals - HTSJDK Defaults.USE_ASYNC_IO_READ_FOR_SAMTOOLS : false
12:00:07.002 INFO CreateReadCountPanelOfNormals - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_SAMTOOLS : true
12:00:07.002 INFO CreateReadCountPanelOfNormals - HTSJDK Defaults.USE_ASYNC_IO_WRITE_FOR_TRIBBLE : false
12:00:07.002 INFO CreateReadCountPanelOfNormals - Deflater: IntelDeflater
12:00:07.002 INFO CreateReadCountPanelOfNormals - Inflater: IntelInflater
12:00:07.002 INFO CreateReadCountPanelOfNormals - GCS max retries/reopens: 20
12:00:07.002 INFO CreateReadCountPanelOfNormals - Requester pays: disabled
12:00:07.002 INFO CreateReadCountPanelOfNormals - Initializing engine
12:00:07.002 INFO CreateReadCountPanelOfNormals - Done initializing engine
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/05/14 12:00:07 INFO SparkContext: Running Spark version 2.2.0
19/05/14 12:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19/05/14 12:00:07 INFO SparkContext: Submitted application: CreateReadCountPanelOfNormals
19/05/14 12:00:07 INFO SecurityManager: Changing view acls to: jml96
19/05/14 12:00:07 INFO SecurityManager: Changing modify acls to: jml96
19/05/14 12:00:07 INFO SecurityManager: Changing view acls groups to:
19/05/14 12:00:07 INFO SecurityManager: Changing modify acls groups to:
19/05/14 12:00:07 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jml96); groups with view permissions: Set(); users with modify permissions: Set(jml96); groups with modify permissions: Set()
19/05/14 12:00:07 INFO Utils: Successfully started service 'sparkDriver' on port 44349.
19/05/14 12:00:07 INFO SparkEnv: Registering MapOutputTracker
19/05/14 12:00:07 INFO SparkEnv: Registering BlockManagerMaster
19/05/14 12:00:07 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19/05/14 12:00:07 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
19/05/14 12:00:07 INFO DiskBlockManager: Created local directory at /rds-d3/user/jml96/hpc-work/CNV_GATK4/Temp_files/blockmgr-551d6652-a87c-4df5-b6a9-08ad42191613
19/05/14 12:00:07 INFO MemoryStore: MemoryStore started with capacity 997.8 MB
19/05/14 12:00:07 INFO SparkEnv: Registering OutputCommitCoordinator
19/05/14 12:00:07 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/05/14 12:00:07 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.43.7.12:4040
19/05/14 12:00:07 INFO Executor: Starting executor ID driver on host localhost
19/05/14 12:00:07 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40271.
19/05/14 12:00:07 INFO NettyBlockTransferService: Server created on 10.43.7.12:40271
19/05/14 12:00:07 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/05/14 12:00:07 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.43.7.12, 40271, None)
19/05/14 12:00:07 INFO BlockManagerMasterEndpoint: Registering block manager 10.43.7.12:40271 with 997.8 MB RAM, BlockManagerId(driver, 10.43.7.12, 40271, None)
19/05/14 12:00:07 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.43.7.12, 40271, None)
19/05/14 12:00:07 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.43.7.12, 40271, None)
12:00:07.815 INFO CreateReadCountPanelOfNormals - Spark verbosity set to INFO (see --spark-verbosity argument)
19/05/14 12:00:07 INFO HDF5Library: Trying to load HDF5 library from:
jar:file:/rds-d3/user/jml96/hpc-work/Software/gatk-4.1.2.0/gatk-package-4.1.2.0-local.jar!/org/broadinstitute/hdf5/libjhdf5.2.11.0.so
19/05/14 12:00:07 INFO H5: HDF5 library:
19/05/14 12:00:07 INFO H5: successfully loaded.
12:00:07.866 INFO CreateReadCountPanelOfNormals - Retrieving intervals from first read-counts file (/rds-d3/user/jml96/hpc-work/CNV_GATK4/Temp_files/H_SL-DCIS64-A1_NODE_chr17_1.bam)...
19/05/14 12:00:07 INFO SparkUI: Stopped Spark web UI at http://10.43.7.12:4040
19/05/14 12:00:07 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/05/14 12:00:07 INFO MemoryStore: MemoryStore cleared
19/05/14 12:00:07 INFO BlockManager: BlockManager stopped
19/05/14 12:00:07 INFO BlockManagerMaster: BlockManagerMaster stopped
19/05/14 12:00:07 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/05/14 12:00:07 INFO SparkContext: Successfully stopped SparkContext
12:00:07.943 INFO CreateReadCountPanelOfNormals - Shutting down engine
[May 14, 2019 12:00:07 PM BST] org.broadinstitute.hellbender.tools.copynumber.CreateReadCountPanelOfNormals done. Elapsed time: 0.02 minutes.
Runtime.totalMemory()=2058354688
java.lang.IllegalArgumentException: The collection is empty: The input header does not contain any read groups. Cannot determine a sample name.
at org.broadinstitute.hellbender.utils.Utils.nonEmpty(Utils.java:618)
at org.broadinstitute.hellbender.tools.copynumber.formats.metadata.MetadataUtils.readSampleName(MetadataUtils.java:23)
at org.broadinstitute.hellbender.tools.copynumber.formats.metadata.MetadataUtils.fromHeader(MetadataUtils.java:46)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.AbstractRecordCollection.<init>(AbstractRecordCollection.java:82)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.AbstractLocatableCollection.<init>(AbstractLocatableCollection.java:58)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.AbstractSampleLocatableCollection.<init>(AbstractSampleLocatableCollection.java:44)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.SimpleCountCollection.<init>(SimpleCountCollection.java:55)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.SimpleCountCollection.readTSV(SimpleCountCollection.java:74)
at org.broadinstitute.hellbender.tools.copynumber.formats.collections.SimpleCountCollection.read(SimpleCountCollection.java:68)
at org.broadinstitute.hellbender.tools.copynumber.CreateReadCountPanelOfNormals.runPipeline(CreateReadCountPanelOfNormals.java:268)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:31)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:139)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:162)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:205)
at org.broadinstitute.hellbender.Main.main(Main.java:291)
19/05/14 12:00:07 INFO ShutdownHookManager: Shutdown hook called
19/05/14 12:00:07 INFO ShutdownHookManager: Deleting directory /rds-d3/user/jml96/hpc-work/CNV_GATK4/Temp_files/spark-dbec535d-cf38-40de-b9e6-dd733b8dc028
