Google Bucket Directories as task input

timdef · Cambridge, MA · Member, Broadie
edited February 2016 in Ask the FireCloud Team

I suspect the answer is "no" or "not yet," but is there a way to pass an entire directory as a single input to a WDL task?

In Firehose, you could pass a directory as a string input. Because the jobs ran on the same filesystem, the code could figure out that the string corresponded to a directory and load the files inside it. A common use case was to keep source code libraries there, so that R scripts and such could find any external functions they needed. This is nice because I don't have to specify each file in the folder explicitly (there might be a variable number of files, with varying filenames).

Something like this might be a nice-to-have, and would save some time rewriting existing Firehose code:

task my_task {
  File my_input_file
  Directory my_input_dir

  command {
    ./my_script --lib-dir ${my_input_dir} ${my_input_file}
  }
}

Then the backend just needs to make sure the entire directory contents are copied into the container (either from a Google bucket or the local filesystem).
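
In the meantime, one workaround is to archive the directory into a single file and pass that as a regular File input. A sketch (assuming the tarball is created before launching the workflow; the task and script names here are just placeholders):

```wdl
task my_task {
  File my_input_file
  File lib_tarball    # the whole library directory, archived into one file beforehand

  command {
    mkdir lib_dir
    # unpack the library next to the script so it can be referenced by a relative path
    tar xzf ${lib_tarball} -C lib_dir --strip-components=1
    ./my_script --lib-dir lib_dir ${my_input_file}
  }
}
```

This keeps the variable number of files invisible to the WDL interface, at the cost of an extra archiving step.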


  • danielr · Broad Institute · Member, Broadie

    I have been running into a similar issue. I would like to pass a task multiple reference files stored in the workspace google bucket without specifying each as their own parameter, but rather, simply pass the entire directory as a parameter to the WDL.
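
    A possible interim approach (a sketch, not a confirmed FireCloud feature; the bucket path and task name are hypothetical, and it assumes gsutil is available in the task's environment) is to pass the bucket prefix as a String and copy it down inside the command:

    ```wdl
    task use_reference_dir {
      String ref_dir    # e.g. "gs://my-workspace-bucket/refs" (hypothetical path)

      command {
        mkdir refs
        # -m parallelizes the copy; -r recurses into the prefix
        gsutil -m cp -r ${ref_dir}/* refs/
        ./my_tool --ref-dir refs
      }
    }
    ```

    Note that a String input is not localized by the backend the way a File is, so the task itself is responsible for the copy.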

  • jneff · Boston · Member, Broadie, Moderator, Admin

    Hi Tim and Daniel:

    I'm reviewing this as a possible enhancement with the FireCloud Support Team. We will keep you posted.
