Picard MarkDuplicates error: Value was put into PairInfoMap more than once

Hi, all!

I get the error "Value was put into PairInfoMap more than once" when I use Picard to mark duplicates.
I have already tried the newest versions, bwa 0.7.16a and Picard 2.10.7.
My mapping parameter is bwa mem -M.
All paired read IDs are unique in the FASTQ files.
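
Roughly, the commands I run look like this; the file names are placeholders, and the read-group string is shown only for completeness:

    # align paired reads; -M flags shorter split hits as secondary
    bwa mem -M -R '@RG\tID:lane1\tSM:sample1\tLB:lib1\tPL:ILLUMINA' \
        ref.fa reads_1.fq.gz reads_2.fq.gz | samtools sort -o sample1.sorted.bam -
    # mark duplicates on the coordinate-sorted BAM
    java -jar picard.jar MarkDuplicates \
        I=sample1.sorted.bam \
        O=sample1.dedup.bam \
        M=sample1.dup_metrics.txt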

When I run samtools view this.bam | grep readname, the output is as captured in this picture:
https://p.qlogo.cn/qqmail_head/LIND77SSexibQw48mEewIK3YKyoKCpB06NW5gSKticJqN2mMnvbd8S7KZdvcuWGj31sIGFzvpM9Bg/0
I have read the earlier threads about this problem on the GATK and Biostars forums. Compared with the previously reported cases, I think the secondary hits in my BAM are correct, but I am not sure. Therefore, I have come here to ask for professional help.
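
A minimal sketch of checks along these lines, with the BAM name as a placeholder:

    # list the read names that occur most often, with their record counts
    samtools view sample1.sorted.bam | cut -f1 | sort | uniq -c | sort -rn | head
    # count records flagged secondary (0x100); with bwa mem -M, split hits are marked secondary
    samtools view -c -f 0x100 sample1.sorted.bam
    # count records flagged supplementary (0x800); usually zero when -M is used
    samtools view -c -f 0x800 sample1.sorted.bam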

Comments

  • Sheila (Broad Institute; Member, Broadie, Moderator)

    @Dicor
    Hi,

    Sorry for the delay. Have a look at this thread.

    This error usually means a read name occurs twice in your BAM file. Did you by any chance merge two FASTQ files before aligning? Make sure your Read Groups are properly specified. Have a look at this dictionary entry for more help on read groups.

    -Sheila
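
    A minimal sketch of setting the read-group fields mentioned above on an existing BAM with Picard AddOrReplaceReadGroups; every RG value shown is a placeholder:

        java -jar picard.jar AddOrReplaceReadGroups \
            I=input.bam \
            O=with_readgroups.bam \
            RGID=lane1 \
            RGLB=lib1 \
            RGPL=ILLUMINA \
            RGPU=unit1 \
            RGSM=sample1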

  • Dicor (China; Member)

    @Sheila
    Thank you for your reply.
    I have already mentioned that all read IDs are unique in my FASTQ files.
    Could you please have a look at the picture?

    I think the problem is that both fragments have a secondary hit on the reference and are marked accordingly, but Picard cannot recognize them.
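
    If it helps, the FLAG column (the second field of the samtools view output in the screenshot) can be decoded with samtools:

        samtools flags 0x100    # decodes to SECONDARY
        samtools flags 0x800    # decodes to SUPPLEMENTARY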

  • SkyWarrior (Turkey; Member)

    Can you filter your BAM file with samtools view -f 0x2 and try marking duplicates again with Picard?

    This has solved the same error for me many times.
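
    For example, a minimal sketch of that filter followed by duplicate marking; file names are placeholders:

        # keep only records with the properly-paired flag (0x2) set
        samtools view -b -f 0x2 sample1.sorted.bam > sample1.properpairs.bam
        java -jar picard.jar MarkDuplicates \
            I=sample1.properpairs.bam \
            O=sample1.dedup.bam \
            M=sample1.dup_metrics.txt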
