
Possible bug in CombineVariants

johnwallace123 Posts: 11, Member
edited February 19 in Ask the GATK team

I believe I may have found an issue with the CombineVariants tool of GATK that manifests when a VCF contains a repeated sample ID. We deliberately create repeated IDs in order to detect sample inconsistencies: we call variants on two different DNA samples from the same individual and then check their concordance. Our current process is:

1) Generate a VCF containing unique IDs (using GATK CallVariants)
2) Replace the VCF header with potentially non-unique IDs (using tabix -r)
3) Merge the single VCF to uniqify the duplicate IDs (using GATK CombineVariants)

It seems that the genotypes in the merged VCF are off by one column. I've attached three files that demonstrate the issue: "combined", the result of step 1; "combined.renamed", the output of step 2; and "combined.renamed.merged", the output of step 3.

The relevant lines are as follows:
combined:

HG00421@123910725 HG00422 HG00422@123910706 HG00423@123910701 NA12801 NA12802
0/0:300           0/0:127 0/0:292           0/0:290           0/0:127 0/0:127
0/0:299           0/0:127 0/0:299           0/0:293           0/0:127 0/0:127

combined.renamed:

HG00421 HG00422 HG00422 HG00423 NA12801 NA12802
0/0:300 0/0:127 0/0:292 0/0:290 0/0:127 0/0:127
0/0:299 0/0:127 0/0:299 0/0:293 0/0:127 0/0:127

combined.renamed.merged:

HG00421 HG00422 HG00423 NA12801 NA12802
0/0:300 0/0:127 0/0:292 0/0:290 0/0:127
0/0:299 0/0:127 0/0:299 0/0:293 0/0:127

Looking at the depth (DP) field, we can see that in the merged dataset NA12801 has depths 290 and 293, whereas in the original and renamed datasets its depths were 127 and 127. The 290/293 depths actually belong to HG00423, the column immediately before it.
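The shift described above is consistent with a merge that deduplicates the sample header by name but still consumes the genotype columns positionally. Here is a minimal sketch (illustrative Python, not GATK code) using the exact header and genotype row from the example:

```python
# Sample header and one genotype row, copied from the "combined.renamed" example.
header = ["HG00421", "HG00422", "HG00422", "HG00423", "NA12801", "NA12802"]
row = ["0/0:300", "0/0:127", "0/0:292", "0/0:290", "0/0:127", "0/0:127"]

# A naive merge dedups the header by name (insertion order preserved),
# collapsing the two HG00422 entries into one...
merged_samples = list(dict.fromkeys(header))

# ...but then pairs the 5 surviving names with the genotype columns by
# position, so every sample after the duplicate is shifted left by one.
merged = dict(zip(merged_samples, row))

print(merged["NA12801"])  # → '0/0:290' (HG00423's column), not '0/0:127'
```

This reproduces the merged output shown above, where NA12801 inherits HG00423's depths. Whether CombineVariants does exactly this internally is an assumption; the sketch only demonstrates the failure mode.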

I have confirmed this behavior in both GATK 2.7-4 and 2.8-1. If there's any more information you need, please let me know and I would be happy to provide it. Also, if you have an idea of where this issue arises, I would be happy to try to provide a patch.

Thanks,

John Wallace

Attachments:
combined.vcf.gz (5K)
combined.renamed.vcf.gz (5K)
combined.renamed.merged.vcf.gz (5K)
Post edited by Geraldine_VdAuwera

Best Answer

Answers

  • ebanks Posts: 679, GATK Developer, mod

    Or just rename them with a suffix: Foo.1 Foo.2 etc.
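    Eric's suggestion can be sketched as a small renaming pass that appends a numeric suffix to each occurrence of a duplicated sample ID before merging. This is an illustrative helper (the name `uniqify` is ours, not a GATK utility):

    ```python
    from collections import Counter

    def uniqify(samples):
        """Append .1, .2, ... to sample IDs that occur more than once."""
        counts = Counter(samples)   # total occurrences per name
        seen = Counter()            # occurrences emitted so far
        out = []
        for name in samples:
            if counts[name] > 1:
                seen[name] += 1
                out.append(f"{name}.{seen[name]}")
            else:
                out.append(name)    # unique names are left untouched
        return out

    print(uniqify(["HG00421", "HG00422", "HG00422", "HG00423"]))
    # → ['HG00421', 'HG00422.1', 'HG00422.2', 'HG00423']
    ```

    The rewritten header can then be swapped in with the same reheadering step used in the original pipeline.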

    Eric Banks, PhD -- Senior Group Leader, MPG Analysis, Broad Institute of Harvard and MIT

  • johnwallace123 Posts: 11, Member

    I think throwing an error would alert users to the issue, but it would be nice if duplicate IDs were supported. Ideally there would be some kind of "smart" merging of the two genotypes, but simply taking the first one seen would be just as valid.

    As for extracting the duplicates to separate files, that is somewhat difficult in our pipeline because there can be anywhere from one to six or more copies of the same person (same person, different DNA runs), so we don't know a priori how many files to create. However, we have amended our scripts to optionally uniqify the samples before renaming, which avoids this issue.

    I have not seen this issue arise with unique sample IDs, so I think that the problem was between the chair and keyboard. Thanks so much for the help; you can file this as a feature request.

  • Geraldine_VdAuwera Posts: 5,988, Administrator, GATK Developer, admin

    John, I can see how this would be useful to you, and it would be fine in your case because you clearly know what you're doing; in general, though, it would be unsafe and could cause severe problems down the road for naive users. Building in this option in a way that is both smart and safe for the average user would simply require too much work at a time when we can't spare the resources. So I don't see this feature happening in the foreseeable future, sorry. Good luck with your work!

    Geraldine Van der Auwera, PhD
