
Reasons MarkDuplicates might not remove all duplicates?

I was using MarkDuplicates to remove duplicates from a BAM file with extremely high coverage. The region is the gene GAPDH, and using Ensembl BioMart I calculated the maximum possible number of covered bases to be 2,877 (by adding up the longest possible length of each exon across all isoforms). From this I would expect to see at most that many unique reads: if reads slide along the gene one base at a time, the number of distinct start positions can't exceed the number of bases. But when I run MarkDuplicates it doesn't get close; nearly 8,000 reads remain. Everything online says MarkDuplicates marks duplicates based on 5' position and strand, not sequence, so it shouldn't be affected by reads having slight variation; if they're mapped to the same position it should remove them, right? I'm not sure if I'm using it wrong or if there's a concept I'm not understanding, so any help would be greatly appreciated.
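A quick way to sanity-check that expectation is to count the distinct 5' start position / strand combinations actually present in the region. This is only a rough sketch: it assumes samtools is available and the BAM is coordinate-sorted and indexed, GAPDH_REGION is just a placeholder for the locus string, and it uses POS (the leftmost aligned base) rather than the unclipped 5' end that Picard actually keys on.

    # Count distinct (chromosome, start position, strand) combinations among
    # primary, mapped alignments in the region. GAPDH_REGION is a placeholder.
    # Rough proxy only: Picard keys duplicates on the unclipped 5' end, not POS.
    samtools view -F 0x904 "${SORTED_BAM_FILE}" "${GAPDH_REGION}" \
        | awk '{strand = (int($2 / 16) % 2) ? "-" : "+"; print $3, $4, strand}' \
        | sort -u \
        | wc -l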

This is my code:

    java -jar picard.jar MarkDuplicates \
        I="${SORTED_BAM_FILE}" \
        O="${UNIQUE_SORTED_BAM_FILE}" \
        M="${HASH_DIRECTORY}/marked_dup_metrics.txt" \
        REMOVE_DUPLICATES=TRUE \
        REMOVE_SEQUENCING_DUPLICATES=TRUE \
        PROGRAM_RECORD_ID=null \
        ASSUME_SORT_ORDER=coordinate   # the file is coordinate-sorted earlier in the pipeline
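
In case it helps, counting primary alignments before and after gives a quick check of what was actually removed (a minimal sketch, assuming samtools is available; the variable names reuse the ones from the command above):

    # Count primary, mapped alignments before and after duplicate removal.
    # -c prints the count; -F 0x904 skips unmapped, secondary and supplementary records.
    samtools view -c -F 0x904 "${SORTED_BAM_FILE}"
    samtools view -c -F 0x904 "${UNIQUE_SORTED_BAM_FILE}"

The marked_dup_metrics.txt file (READ_PAIR_DUPLICATES, PERCENT_DUPLICATION, etc.) reports what Picard itself flagged.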


Answers

  • shlee (Cambridge; Member, Administrator, Broadie, Moderator, admin)
    RachelM42,

    Do you have paired-end reads? If so, MarkDuplicates defines duplicates at the _insert_ level, not per individual read (there is a rough sketch of what that means below this thread).
  • Hi, thanks for your answer :) Could you expand a bit? I'm not entirely sure I understand what you mean by that. Is it that, because it's considering the insert, the reads are less likely to be the same?
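
For paired-end data the duplicate key is effectively the combination of both mates' unclipped 5' coordinates and orientations, i.e. the whole fragment, so two pairs that share a read-1 start position but have different read-2 positions are not duplicates of each other. That makes the space of unique records much larger than the number of bases in the exons, which would be consistent with roughly 8,000 reads surviving rather than roughly 2,877. A rough illustration of the idea (only an approximation of Picard's actual keys, since it uses POS and PNEXT rather than the unclipped 5' ends Picard computes; GAPDH_REGION is a placeholder for the locus string):

    # Approximate the per-insert duplicate key for paired-end data: for each
    # first-of-pair primary alignment, print the chromosome, its position, the
    # mate's position and the strand bits, then count distinct combinations.
    # Picard's real key uses unclipped 5' ends, so soft-clipping makes this
    # only an approximation.
    samtools view -f 0x40 -F 0x904 "${SORTED_BAM_FILE}" "${GAPDH_REGION}" \
        | awk '{fwd = int($2 / 16) % 2; mfwd = int($2 / 32) % 2; print $3, $4, $8, fwd, mfwd}' \
        | sort -u \
        | wc -l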
