
Does GATK4 expose an interface for FPGA-accelerated BWA?

We are working on FPGA acceleration for BWA. Does GATK4 expose a corresponding interface? Thanks.

Issue · Github
by Sheila

Issue Number: 2635
State: closed
Closed By: chandrans

Answers

  • Sheila · Broad Institute · Member, Broadie admin

    @GraceZou
    Hi,

    I will check with the team and get back to you.

    -Sheila

  • GraceZou · China · Member

    @Sheila Do you have an answer? Thanks.

  • Sheila · Broad Institute · Member, Broadie admin

    @GraceZou
    Hi,

    Currently, FPGA PairHMM support is available in HaplotypeCaller and MuTect2. FPGA support for BWA is not included.

    -Sheila
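
    For context, a minimal command-line sketch of how the PairHMM implementation is chosen in GATK4 HaplotypeCaller; the option name and the FPGA enum value are assumptions based on GATK4's PairHMM implementation switch and may differ between releases:

        # Hypothetical example: request an FPGA-backed PairHMM kernel for HaplotypeCaller.
        # The exact enum value and fallback behaviour depend on the GATK build and the
        # Intel GKL support it was compiled with.
        gatk HaplotypeCaller \
            -R reference.fasta \
            -I sample.bam \
            -O sample.vcf.gz \
            --pair-hmm-implementation EXPERIMENTAL_FPGA_LOGLESS_CACHING

    In most releases the same option also accepts values such as FASTEST_AVAILABLE or AVX_LOGLESS_CACHING when no FPGA kernel is present.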

  • GraceZou · China · Member

    @Sheila OK, I got it. Do you support the "ADAM" file format? I found the source code below in GATK4.beta5, but I can't find any tool that can use this kind of input file.

    /**
     * Loads ADAM reads stored as Parquet.
     * @param inputPath path to the Parquet data
     * @return RDD of (ADAM-backed) GATKReads from the file.
     */
    public JavaRDD<GATKRead> getADAMReads(final String inputPath, final TraversalParameters traversalParameters, final SAMFileHeader header) throws IOException {
        Job job = Job.getInstance(ctx.hadoopConfiguration());
        AvroParquetInputFormat.setAvroReadSchema(job, AlignmentRecord.getClassSchema());
        // Broadcast the (possibly null) header so executors can rebuild GATKReads.
        Broadcast<SAMFileHeader> bHeader;
        if (header == null) {
            bHeader = ctx.broadcast(null);
        } else {
            bHeader = ctx.broadcast(header);
        }
        // Read ADAM AlignmentRecords from Parquet via the Avro/Parquet Hadoop input format.
        @SuppressWarnings("unchecked")
        JavaRDD<AlignmentRecord> recordsRdd = ctx.newAPIHadoopFile(
                inputPath, AvroParquetInputFormat.class, Void.class, AlignmentRecord.class, job.getConfiguration())
                .values();
        // Wrap each AlignmentRecord as a GATKRead and keep only reads overlapping the requested intervals.
        JavaRDD<GATKRead> readsRdd = recordsRdd.map(record -> new BDGAlignmentRecordToGATKReadAdapter(record, bHeader.getValue()));
        JavaRDD<GATKRead> filteredRdd = readsRdd.filter(record -> samRecordOverlaps(record.convertToSAMRecord(header), traversalParameters));
        return putPairsInSamePartition(header, filteredRdd);
    }

    /**
     * Loads the header using Hadoop-BAM.
     * @param filePath path to the bam.
     * @param referencePath Reference path or null if not available. Reference is required for CRAM files.
     * @return the header for the bam.
     */
    
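    A rough usage sketch, assuming the ".adam" dispatch behind getADAMReads above is wired into the GATK4 Spark tools; the adam-submit command and the PrintReadsSpark round trip are illustrative and not confirmed for any particular release:

        # Convert a BAM to an ADAM/Parquet directory with the ADAM CLI (assumed to be installed).
        adam-submit transformAlignments sample.bam sample.reads.adam

        # Round-trip the ADAM directory through a GATK4 Spark tool; a "*.adam" input path is what
        # would route through getADAMReads if the release supports it.
        gatk PrintReadsSpark -I sample.reads.adam -O roundtrip.bam
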
  • Sheila · Broad Institute · Member, Broadie admin

    @GraceZou
    Hi,

    Have a look at this thread :smile:

    -Sheila
