Update: July 26, 2019
This section of the forum is now closed; we are working on a new support model for WDL that we will share here shortly. For Cromwell-specific issues, see the Cromwell docs and post questions on GitHub.

Using AWS backend with EFS

Some questions from a colleague who is attempting to run the workflow on AWS:

How do I use EFS instead of S3 when running Cromwell on AWS? I tried specifying it as one of the filesystems in both the backend and engine stanzas, alongside S3, but EFS appears to be ignored. If I specify only EFS, Cromwell fails to start with an error about the missing S3 filesystem.
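For reference, these are the two places the question refers to, sketched from the stock AWS backend template in the Cromwell docs. Note that the only filesystem type this template declares is `s3`; the auth name `default` and the omitted queue/bucket settings are placeholders, and there is no documented `efs` filesystem type to swap in here:

```hocon
backend {
  default = AWSBatch
  providers {
    AWSBatch {
      actor-factory = "cromwell.backend.impl.aws.AwsBatchBackendLifecycleActorFactory"
      config {
        # Filesystems the backend can read inputs from / write outputs to
        filesystems {
          s3 { auth = "default" }
        }
      }
    }
  }
}

engine {
  # Filesystems the engine itself uses (e.g. to resolve workflow sources)
  filesystems {
    s3 { auth = "default" }
  }
}
```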

This is my requirement: my workflow's input files are always on EFS, and I would like Cromwell to link to them there instead of copying from S3 to local EBS. How do I achieve this? (Note: we realize EFS is much slower and not ideal, but unfortunately the complexity of our inputs makes S3 impossible to use right now.)

Also, the EFS volume is mounted on my AWS Batch compute instances. How do I specify the mount point to the container? (I ask because I don't have control over creating AWS Batch job definitions.)
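For context, exposing a host mount to a container is normally done in the AWS Batch job definition itself, via the `volumes` and `mountPoints` fields of `containerProperties`. A minimal sketch is below; the path `/mnt/efs` and the volume name `efs` are assumptions, and since Cromwell's AWS backend generates job definitions on the fly, there is no obvious place to inject this without backend support:

```json
{
  "containerProperties": {
    "volumes": [
      { "name": "efs", "host": { "sourcePath": "/mnt/efs" } }
    ],
    "mountPoints": [
      { "sourceVolume": "efs", "containerPath": "/mnt/efs", "readOnly": false }
    ]
  }
}
```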

