
A short 2bp indel not detected with GATK

This seems weird: there is a 2 bp deletion at a chromosomal position that GATK did not detect. When inspected in IGV, it occurs at about 6% allele frequency in the sample's BAM. The GATK UnifiedGenotyper (both versions 1.6 and 2.8) did not call it. The same deletion was seen at approximately the same frequency in the same sample on the previous run, where it was annotated with a quality of 11.0 in the QUAL column (I have -stand_emit_conf set to 10 and -stand_call_conf at its default of 30).
I thought quality might be playing a role in whether this deletion is called, so I decreased the emission cutoff to 0 (-stand_emit_conf 0) while keeping -stand_call_conf at its default of 30. I expected the low-quality variants between 0 and 30 to be emitted and flagged as "LowQual". But unfortunately the deletion was still not detected, so it may be that its quality is effectively annotated as 0.
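For context, here is how I understand the two confidence thresholds to interact. This is only a simplified sketch paraphrasing the documented behaviour of -stand_emit_conf and -stand_call_conf, not GATK code:

```python
# Simplified sketch of how UnifiedGenotyper's confidence thresholds
# classify a candidate site. Not GATK source; the semantics are
# paraphrased from the documented behaviour of -stand_emit_conf
# (emission threshold) and -stand_call_conf (calling threshold).

def classify_variant(qual, emit_conf=10.0, call_conf=30.0):
    """Return how a site with Phred-scaled QUAL would be handled."""
    if qual >= call_conf:
        return "PASS"          # confidently called
    if qual >= emit_conf:
        return "LowQual"       # emitted, but flagged as low quality
    return "not emitted"       # below the emission threshold entirely

print(classify_variant(11.0))               # the previous run's QUAL 11 site
print(classify_variant(11.0, emit_conf=0))  # still LowQual, not PASS
```

Under this reading, lowering -stand_emit_conf to 0 only helps if the site's QUAL is above 0 in the first place, which is why I suspect the deletion's quality is being computed as 0.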
My question is: how is the quality of a variant (especially a deletion) calculated?
What is the strategy in general (let's say for calculating the quality of an SNV)? Do you average the quality of the variant base across all the reads aligned at that position?
And what is the strategy for calculating the quality of a deletion?
Do you look at the quality of the bases around the deletion in all deletion-containing reads aligned at that position (in my case, 6% of the reads)?
Or do you estimate the quality at that position by averaging the base qualities in the remaining 94% of reads aligned there?

Maybe the deletion was not called even at a -stand_emit_conf cutoff of 0 because its quality was annotated as 0 itself. But the question remains: how do you decide on that quality value?
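To make my question concrete, here is a toy model of what I imagine a likelihood-based QUAL might look like. This is emphatically NOT GATK's actual model (I understand GATK uses a full Bayesian genotyping model, with realignment-based likelihoods for indels); the sketch only illustrates why a 6% allele fraction could yield QUAL near 0 even when the supporting reads themselves have high base qualities:

```python
import math

# Toy Phred-scaled variant quality from genotype likelihoods.
# Hypothetical model for illustration only: compares a hom-ref
# hypothesis (alt reads are sequencing errors) against a diploid
# het hypothesis (~50% alt reads expected). All names and priors
# here are my own assumptions, not GATK parameters.

def genotype_likelihood(reads, err, alt_fraction):
    """log P(data | genotype): each read supports alt (True) or ref (False)."""
    logl = 0.0
    for supports_alt in reads:
        p_alt_read = alt_fraction * (1 - err) + (1 - alt_fraction) * err
        logl += math.log(p_alt_read if supports_alt else 1 - p_alt_read)
    return logl

def phred_qual(reads, err=0.01, prior_var=1e-3):
    """QUAL = -10*log10 P(hom-ref | data) under the toy two-hypothesis model."""
    log_hom_ref = genotype_likelihood(reads, err, 0.0) + math.log(1 - prior_var)
    log_het     = genotype_likelihood(reads, err, 0.5) + math.log(prior_var)
    # normalise the posterior via log-sum-exp over the two hypotheses
    m = max(log_hom_ref, log_het)
    total = m + math.log(math.exp(log_hom_ref - m) + math.exp(log_het - m))
    p_hom_ref = math.exp(log_hom_ref - total)
    return -10 * math.log10(max(p_hom_ref, 1e-300))

# 6 alt reads out of 100: a het genotype would predict ~50% alt, so
# "hom-ref plus sequencing error" explains the data far better and the
# site's QUAL collapses toward 0 regardless of per-base qualities.
reads = [True] * 6 + [False] * 94
print(round(phred_qual(reads), 1))  # → 0.0
```

If something like this is going on, then a 6% deletion would get QUAL ~0 under a diploid model, and no emission threshold would rescue it.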
