Split'N'Trim error when handling a huge BAM file
Hi GATK team, I encountered a problem running Split'N'Trim on a BAM file (about 27 GB in size).
It prompted the following error message:
## ERROR MESSAGE: An error occurred when trying to write the BAM file. Usually this happens when there is not enough space in the directory to which data is being written (generally the temp directory) or when your system's open file handle limit is too small. To tell Java to use a bigger/better file system use -Djava.io.tmpdir=X on the command line. The exact error was java.io.FileNotFoundException: _tmp/sortingcollection.8357100622428529694.tmp (Too many open files)
I believe the file may simply be too large to handle. I also tried setting -Djava.io.tmpdir=Xmx4g, but it gave the same error message. What would you recommend for handling data this large?
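For what it's worth, the error above points at two separate knobs: `-Djava.io.tmpdir` expects a directory path (not a memory size; heap size is set separately with `-Xmx`), and "Too many open files" usually means the shell's open file handle limit needs raising. A minimal sketch of one way to set both before launching the job (the temp directory path and jar/tool names below are illustrative, not taken from your setup):

```shell
# Raise the per-process open file handle limit for this shell session;
# the sorting collection spills many temp files at once and can hit the cap.
ulimit -n 4096

# Create a temp directory on a disk with plenty of free space,
# and pass its PATH (not a memory size) to -Djava.io.tmpdir.
SORT_TMP_DIR=/large_disk/tmp   # illustrative path; pick a roomy filesystem
mkdir -p "$SORT_TMP_DIR"

# Heap size (-Xmx) is a separate flag from the temp directory.
# Jar name and tool arguments here are assumptions about your invocation.
java -Xmx4g -Djava.io.tmpdir="$SORT_TMP_DIR" \
    -jar GenomeAnalysisTK.jar -T SplitNCigarReads \
    -R ref.fasta -I input.bam -o output.bam
```

If the hard limit blocks `ulimit -n`, it may need to be raised by an administrator (e.g. in `/etc/security/limits.conf`).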