memory used by DepthOfCoverage

Hi,

I tried to use DepthOfCoverage to compute the coverage of 50 whole-exome sequencing samples. It works fine for a single sample, but on all 50 the program always complains about memory, even when I give it 100 GB with "-Xmx100g". Any suggestions for this problem? There is an option "--read_buffer_size" that looks like it might help; how should I set its value?
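For context, here is a rough sketch of the kind of command involved (the reference, BAM list, and interval list names are placeholders, and the --read_buffer_size value is only an illustration, not a recommendation):

    # exome_bams.list: the 50 exome BAMs, one path per line (hypothetical names)
    java -Xmx100g -jar GenomeAnalysisTK.jar \
        -T DepthOfCoverage \
        -R reference.fasta \
        -I exome_bams.list \
        -L exome_targets.interval_list \
        -o all_samples_coverage \
        --read_buffer_size 100000

As I understand it, --read_buffer_size limits the number of reads buffered in memory per input file, so smaller values should lower memory use at the cost of speed.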


Answers

  • chunxuan Member
    edited March 2014

    Some updates:

    The error output is

    OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00007ff011600000, 2365587456, 0) failed; error='Cannot allocate memory' (errno=12)

    There is insufficient memory for the Java Runtime Environment to continue.
    Native memory allocation (malloc) failed to allocate 2365587456 bytes for committing reserved memory.

    java -version

    java version "1.6.0_26"

    Java(TM) SE Runtime Environment (build 1.6.0_26-b03)

    Java HotSpot(TM) 64-Bit Server VM (build 20.1-b02, mixed mode)

    Does anyone have a similar problem?

  • chunxuan Member

    @Geraldine_VdAuwera said:
    Hi chunxuan,

    DepthOfCoverage is very greedy for memory so running on 50 samples may be an issue, especially if you have areas of very high depth. Unfortunately there's nothing much we can do about that; the recommended way to deal with this is to run DoC separately on each individual sample and then analyze the results jointly using your preferred statistical package.

    A small update: do not assign too much memory with -Xmx; sometimes over-allocating the heap is itself what triggers the memory error. A sketch of the per-sample approach described above is shown below.
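
    As a rough sketch of the per-sample approach Geraldine describes (the reference, interval list, output names, and the 4g heap are placeholders, not recommendations):

        # run DepthOfCoverage once per sample, then analyze the per-sample
        # outputs jointly with your preferred statistical package
        for bam in sample*.bam; do
            sample=$(basename "$bam" .bam)
            java -Xmx4g -jar GenomeAnalysisTK.jar \
                -T DepthOfCoverage \
                -R reference.fasta \
                -I "$bam" \
                -L exome_targets.interval_list \
                -o "${sample}_coverage"
        done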
