
FireCloud Basics

Geraldine_VdAuwera · Cambridge, MA · Member, Administrator, Broadie admin
edited June 2017 in Archive

This document is in the process of being deprecated and replaced by individual articles about the topics it covers.

The FireCloud world is organized into workspaces. The sections below cover creating and using workspaces, uploading and downloading data, configuring methods, and running analyses.


Workspace Overview

A workspace is a computational sandbox where you can organize genomic data and tools, and run analyses. Users can create, share, and clone workspaces.

Workspaces hold:

  • Data: pre-loaded or user-uploaded, open access or controlled access

  • Workflows: pre-loaded or user-created

  • Tools: pre-loaded or user-created

  • Results: from all runs, captured with provenance

Workspace Concepts

Workspaces contain a data model to organize data and metadata, and simplify analysis runs for large data sets. The data model includes predefined entity types (e.g., participant and sample set), relationships, and attributes. For your convenience, results from analyses are populated directly to the data model. Currently, the data model is tailored to TCGA data, but will be extensible to non-TCGA projects with a germline or cell-line focus.

The data model includes entities and entity attributes. Entities refer to a physical thing (e.g., a participant) or a collection of physical things (e.g., participant sets). FireCloud uses entities to provide organization and hierarchical structure for data. For example, a participant entity refers to a participant; a sample entity refers to a sample from that participant.

Meanwhile, entity attributes are used to describe entities and associate data to entities. An entity attribute can include values (e.g., numbers or strings) and file paths to data (e.g., the URL of a Google bucket). For example, a participant (entity) can have an age (entity attribute). A sample (entity) can also have an associated BAM that resides in a Google bucket.

Entity attributes can serve as inputs and outputs to methods. For example, a sample (entity) can reference a BAM file path (entity attribute) that serves as the input to a method. This method can in turn generate outputs that populate new entity attributes as results.

Google Buckets

Upon creation, a workspace generates a single Google bucket within Google Cloud Platform (GCP). FireCloud uses Google buckets to store the data generated in your workspace in the cloud. All storage costs are charged to FireCloud Billing Projects; refer to Projects and Billing Accounts for more information.

In the example below, if you create a new workspace called "Broad_GTEx_RNASeq," the system creates a new bucket, e.g., fc-25ec6523-aad2-49e7-9b59-c89a737276c6 with a clickable link below Google Bucket.


Total Estimated Storage Fee per Month

FireCloud displays a Total Estimated Storage Fee per month for every bucket associated with a workspace. You can view this information in the Workspace Summary tab, below the Google bucket id.

To calculate the estimate, FireCloud applies the Google Cloud Storage (GCS) General Pricing model ($0.026/GB/month, or $26/TB/month) to the total size of all files in your Google bucket. The estimate includes any files that you uploaded or copied directly to your bucket, as well as files written to your bucket by analysis submissions within your workspace. Example: your Google bucket includes a dozen files with a total size of 1.141 GB. The Total Estimated Storage Fee per month is $0.03.

Please note that FireCloud updates the estimate on a daily basis as it receives storage information from Google. If you add files to your Google bucket, please allow at least 24 hours for FireCloud to display the updated estimate.
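
You can check the arithmetic behind the estimate yourself. A minimal sketch using the example figures from this section ($0.026/GB/month, 1.141 GB of files):

```shell
# Estimate the monthly storage fee: total bucket size (GB) x GCS price per GB/month.
# Both figures are the example values from this section; awk does the floating-point math.
total_gb=1.141
price_per_gb=0.026

awk -v gb="$total_gb" -v p="$price_per_gb" \
    'BEGIN { printf "Total Estimated Storage Fee per month: $%.2f\n", gb * p }'
```

This prints an estimate of $0.03, matching the example above.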

Workspace Attributes

Workspace attributes are globally accessible to all methods within a workspace. If you enter workspace attributes in the workspace Summary tab, they can serve as inputs for any method you run within your workspace.

Click Edit and Add new to add a new workspace attribute. In the example below, the Key markers_file refers to the name of the file and the Value gs://fc-44cb2981-5e5a-4ec0-bb0c-0a9ff5966c6f/markers_file.txt refers to the Google Bucket file path. After you click Save, any method configurations within this workspace can reference this file as an input to run an analysis.


You can also click Import Attributes and Download Attributes to import or download workspace attributes as a tab-separated-value (TSV) file.


When importing a new TSV file for workspace attributes, note the file format requirement:


Only the first column header MUST begin with the workspace: prefix, i.e., workspace:[Key]. Enter the Value in the row below each Key. To enter multiple workspace attributes, simply add a new column with a Key row and Value row.
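
As a sketch of that layout, the following writes a two-attribute TSV from the shell. The markers_file path is the example value shown above; extra_key and some_value are hypothetical placeholders:

```shell
# Write a workspace-attributes TSV: one column per attribute,
# Keys in the header row (the first header carries the workspace: prefix),
# Values in the single row below.
printf 'workspace:markers_file\textra_key\n' >  workspace_attributes.tsv
printf 'gs://fc-44cb2981-5e5a-4ec0-bb0c-0a9ff5966c6f/markers_file.txt\tsome_value\n' >> workspace_attributes.tsv

cat workspace_attributes.tsv
```

A file like this can then be imported via Import Attributes.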


Workspaces also include method configurations, which bind data to methods (the tools they contain). You can use a method config to specify which entity attributes to use as inputs to an analysis run and which entity attributes to populate with results.

Accessing an Existing Workspace

After you log on to FireCloud, any workspaces to which you have access display in the Workspaces List.


From here you can search for workspaces using the filter box. To enter a workspace, click on its name (e.g., my_workspace).

Workspace Navigation

The Summary tab describes basic workspace details such as its owner(s) and description.


To view the workspace data model and attributes, click on the Data tab.


You can view different entities (participant, sample, pair, pair sets) in your data model by clicking the buttons above.

Use the Method Configurations tab to view all method configurations for your workspace. For tutorial workspaces, method configurations may be pre-populated with methods to run analyses. You can also import method configurations from the Method Repository.


You can click on a single method configuration to view its settings.


Information about the method displays below the Method Configurations tab. Additionally, you can view methods you have access to by clicking the Method Repository link at the top of the page.

Each method has a set of inputs and outputs that define how the data model feeds data to methods and updates the data model from analysis results. Input parameters get their values from attributes of the pair entity that the method is executed on.

In the example below, the user entered the "case_bam" attribute value for the "tumor_bam" input parameter. The method will execute on the "case_bam" attribute for this input parameter when the user launches an analysis.


The Monitor tab displays analyses that you run in FireCloud.


You can check the status of an analysis by clicking on it.


Creating a New Workspace

From the Workspaces view, click the Create New Workspace... button.


You will then be prompted to enter information about your new workspace. If you provided a Google Billing Account during registration, you can select a Google Cloud Platform (GCP) Project to track compute and storage costs for this workspace. Refer to Projects and Billing Accounts for more information.

Workspace Access Controls

FireCloud workspace access control lists (ACLs) define three access levels: READER, WRITER, and OWNER, where each access level expands on the permissions of the previous one.

You can update workspace access controls from the workspace Summary tab.


READER Access

If a workspace ACL grants a user READER access, the user can:

  • enter the workspace and view its contents

  • clone the workspace

  • copy data and method configs from that workspace to one in which the user has been granted WRITER or OWNER access

The user cannot:

  • make changes to the data model (add/delete entities, edit metadata)

  • add/delete method configs

  • edit method configs

  • launch an analysis (submit a method config for execution)

  • abort submissions


WRITER Access

If a workspace ACL grants a user WRITER access, the user has all the permissions granted to a user with READER access, and in addition can:

  • make changes to the data model (add/delete entities)

  • create new collections (sample sets, participant sets, pair sets) from existing non-set entities (samples, participants, pairs)

  • delete and edit entities

  • add/modify entities, including the ability to:

      • copy entities from another workspace’s data model into the workspace, provided the user has at least READER access to the source workspace

      • upload data entities and their data files directly to the workspace

  • add/modify/delete method configs, including the ability to:

      • copy method configs to the workspace from the Method Repository, provided the user has READER access to the method config

      • copy method configs from another workspace, provided the user has at least READER access to the source workspace

  • edit method configs within the workspace

OWNER Access

If a workspace ACL grants a user OWNER access, the user has all the permissions granted to a user with WRITER access, and in addition:

  • can edit the workspace’s ACL

  • can delete a workspace

When you create or clone a workspace, the new workspace’s ACL automatically grants you OWNER-level permissions.

Uploading and Downloading Data

In order to add data to your workspace, you need to upload your data files to the workspace bucket. You can upload data to your bucket through either the Google Developers Console or Google’s command line utility gsutil.

Google Developers Console

Users can upload and download data through the Google Developers Console.

To access buckets, navigate to your Workspace Summary tab and click on the bucket URL, e.g., fc-f498747a-b7d8-4d78-937e-26f0eb27cfa0.

Once you are in a bucket, you can click Upload Files. To download a file, click on its name, e.g., panel_100_genes.interval_list.



gsutil

First, install gsutil on your local computer. The Google Cloud SDK installation includes gsutil. To install the Google Cloud SDK:

  1. Run the following command in a bash shell in your Terminal: curl https://sdk.cloud.google.com | bash. Alternatively, download google-cloud-sdk.zip or google-cloud-sdk.tar.gz and unpack it. Note: The command is only supported in bash shells.

  2. Restart your shell: exec -l $SHELL or open a new bash shell in your Terminal.

  3. Run gcloud init to authenticate, set up a default configuration, and clone the project's Git repository.

Before uploading data using gsutil, you can list buckets you have access to by running gsutil ls, or gsutil ls -p [project name] to list buckets for a specific project.

To upload data to a bucket, run gsutil cp [local file path] [bucket URL]. You must have read/write access to the bucket.

The bucket URL is the path to your file in the Google Cloud SDK. It will look like gs://[bucket name], e.g. gs://jntest10052015 or for folders within a bucket, gs://[bucket name]/[folder name], e.g., gs://jntest10052015/gene_files.

To download data from a bucket, run gsutil cp [bucket URL]/[file name] [local file path], where the source looks like gs://jntest10052015/HCC1143.100_gene_250bp_pad.bam.
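
As a dry-run sketch, the commands above can be composed and printed before running them for real. The bucket and file names are the example values from this section, not a real bucket, so the commands are only echoed here:

```shell
# Compose (but do not execute) the gsutil upload and download commands described above.
# Bucket and file names are the placeholder examples from this section.
bucket="gs://jntest10052015"
file="HCC1143.100_gene_250bp_pad.bam"

upload="gsutil cp ./${file} ${bucket}/"
download="gsutil cp ${bucket}/${file} ."

printf '%s\n' "$upload" "$download"
```

Once the printed commands look right for your own bucket, drop the quoting and run them directly.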

Entity Attributes

FireCloud uses entity attributes to describe data entities (e.g., a participant identifier) and reference entity file locations (e.g., the URL to a Google Cloud Storage bucket). From the Data tab, you can click the Import Data… button to import new entity attributes or add to existing attributes within a workspace.

You can import entity attributes by clicking Import from file or Copy from another workspace. Note that copying from another workspace will not import the data into your workspace bucket. Rather, it will refer to file paths in the bucket of the workspace you copied. Thus, if that workspace bucket is deleted, your workspace data model will no longer refer to an existing bucket path.


Data Model

This section describes format requirements for load files and how FireCloud translates files into data model entities and relationships.

FireCloud Load File Format

Data can be imported to FireCloud as tab-separated-value (TSV) files (e.g., a .txt file) where each line in the file corresponds to an entity. All of the lines in a load file must reference entities of the same type and separate files must be used for each entity type.

The FireCloud data model supports the following entity types:

  • Participant

  • Sample

  • Pair

  • Participant Set

  • Sample Set

  • Pair Set

The first line of a TSV load file must contain the appropriate field names as column headers.

Below are load file entity types and their corresponding first-column headers.

Load File Entity Type    First-Column Header
Participant              entity:participant_id
Sample                   entity:sample_id
Pair                     entity:pair_id
Participant Set          entity:participant_set_id
Sample Set               entity:sample_set_id
Pair Set                 entity:pair_set_id

Non-Set Entity Load Files


  • first column: entity:participant_id (key)

  • subsequent columns: entity attributes

  • exactly one row per entity in the file

entity:participant_id disease gender age clinical_xml
TCGA-5M-AAT4 COAD M 53 gs://TCGA/tcga-5m-aat4.xml
TCGA-NH-A8F8 COAD M 67 gs://TCGA/tcga-nh-a8f8.xml
TCGA-5M-AAT5 COAD F 50 gs://TCGA/tcga-5m-aat5.xml

Example of Participants Load File


  • first column: entity:sample_id (key)

  • subsequent columns (no ordering requirement):

  • participant_id (foreign key)

  • entity attributes

  • exactly one row per entity in a file

entity:sample_id participant_id sample_type
TCGA-5M-AAT4-01A TCGA-5M-AAT4 primary_solid_tumor
TCGA-5M-AAT4-10A TCGA-5M-AAT4 blood_derived_normal
TCGA-NH-A8F8-01A TCGA-NH-A8F8 primary_solid_tumor
TCGA-NH-A8F8-10A TCGA-NH-A8F8 blood_derived_normal

Example of Samples Load File


  • first column: entity:pair_id (key)

  • subsequent columns (no ordering requirement):

  • case_sample_id (foreign key)

  • control_sample_id (foreign key)

  • participant_id (foreign key)

  • entity attributes (optional)

  • exactly one row per entity in a file

entity:pair_id case_sample_id control_sample_id participant_id

Example of Pairs Load File

Set Entity Load Files

FireCloud uses Membership load files to specify set entity membership and Update load files to specify set entity attributes. Update load files are ONLY necessary if you want to specify attributes for a set.

In Membership load files, each line lists the membership of a non-set entity (e.g., participant) in a set (e.g., participant set). The first column contains the identifier of the set entity and the second column contains a key referencing a member of that set.

membership:participant_set_id participant_id

Example of Participant Set Membership Load File

Note: Multiple rows in the Membership load file may have the same set entity id (e.g., TCGA_BRCA).

Meanwhile, Update load files specify set entity attributes. The first column contains the set entity identifier and subsequent columns contain entity attributes, e.g., a gistic2_input file.

In Update load files, the set entity referenced in the first column must already exist in the workspace data model.

update:participant_set_id gistic2_input
TCGA_COAD gs://fc-e6e9f4dc-28e8.../gisticInputs.txt
TCGA_BRCA gs://fc-e6e9f4dc-28e8-.../gisticInputs.txt

Example of Participant Set Update Load File

Note: Multiple rows for the same set entity are NOT permitted in Update load files. To add additional attributes for the same set entity id, you should use additional columns.

Generating Load Files

The FireCloud load file format permits users to copy and paste data from Excel into a TSV file of their choice. When you copy data from Excel and paste it into a text editor, tabs are inserted between columns.
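
A load file can also be generated directly from the shell. A minimal sketch that rebuilds part of the participants load file shown earlier (two of its three example rows):

```shell
# Build a participants load file: tab-separated, header row first,
# exactly one row per entity. Data rows are example participants from this document.
{
  printf 'entity:participant_id\tdisease\tgender\tage\tclinical_xml\n'
  printf 'TCGA-5M-AAT4\tCOAD\tM\t53\tgs://TCGA/tcga-5m-aat4.xml\n'
  printf 'TCGA-NH-A8F8\tCOAD\tM\t67\tgs://TCGA/tcga-nh-a8f8.xml\n'
} > participants.tsv

cat participants.tsv
```

The resulting participants.tsv can be imported via the Import Data… button.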

Order for Uploading Load Files

Load files must be imported in a strict order due to references to other entities. You can click the Import Data… button in order to browse for files to import:


The order is as follows ("A > B" means entity type A must precede B in order of upload):

  • participants > samples

  • samples > pairs

  • participants > participant sets

  • samples > sample sets

  • pairs > pair sets

  • set membership > set entity, e.g., participants > samples > sample set membership > sample set entity.

After you upload a load file successfully, a confirmation message appears.


If the import fails, a failure message specifies what went wrong with the import.

Overwriting and Deleting Entity Attributes

If an entity attribute already exists, you can import or copy load files to overwrite its values. For example, you may have previously imported a participant load file that had several columns for entity attributes. If you import another participant load file, it will overwrite values in all columns that existed in the previous load file, and create new entries for any new columns.

You can also enter DELETE in a load file to remove an entity attribute value. For example,

entity:participant_id disease gender
TCGA-5M-AAT4 DELETE M


Methods

A method can correspond to a single task or a workflow. Tasks are executable programs (e.g., a Tool) bundled into a Docker image. Workflows, composed of one or more tasks, contain the method and the method input parameters. FireCloud submits both tasks and workflows to the Google Job Execution System (JES) when they are ready to run (i.e., when inputs are available, which may be the outputs of upstream tasks).

An Analysis is what FireCloud submits to JES when launching a method configuration against an entity or entity set. It is a combination of a method config and an entity or entity set, identifying the:

  • method that runs;

  • number of times the method runs; and

  • inputs and outputs for each run.

Inputs and Outputs are mapped to attributes on data model entities. In response to the user submission, JES launches a workflow for each run of the method.

FireCloud uses WDL (Workflow Description Language), a domain specific language, to describe tasks and workflows.

In the Method Repository, a method is described by a WDL file whose tasks reference Docker images. The FireCloud Method Repository does not store the Docker images; rather, these are stored on Docker Hub.

FireCloud identifies methods through the following unique identifiers:

  • a namespace;

  • a method name; and

  • a snapshotID.

When a method is added to the method repository, it is automatically assigned a new snapshotID. This ensures methods in the method repository are never overwritten and that provenance is fully captured.

For example, if you add the tool myAligner to the method repository under the namespace myNamespace, its identifier in the method repository might be: myNamespace/myAligner/1

Method Repository

The method repository contains namespaces, methods, and method configs.

Method Repository Namespaces

A namespace is a "folder" of methods and method configs. The method repository has namespaces under which both methods and method configs are stored. You can name your namespaces however you see fit, provided the chosen namespace does not already exist. Administrators may verify namespaces, which, like verified Twitter accounts, establishes the authenticity of namespaces attached to specific organizations (e.g., Broad Institute).

Uploading Methods through the UI

You can upload your method (WDL) directly through the FireCloud UI. In the Method Repository, click on Create new method.... Then, copy/paste your WDL into the WDL text block or click Load from file... to load a WDL file from your computer.

Namespace: the "folder" where you want to store this method

Name: the desired name of this method

Type: FireCloud supports both single task WDLs and workflows that comprise multiple tasks

Synopsis (optional): a description of this method

Documentation (optional): any markdown text you want to add to describe this method


First, you must create a text file on your local filesystem that contains the WDL for the method.

Below is an example of a WDL file.

task M2 {
  File ref_fasta
  File ref_fasta_dict
  File ref_fasta_fai
  File tumor_bam
  File tumor_bai
  File normal_bam
  File normal_bai
  File intervals
  String m2_output_vcf_name

  command {
    java -jar /task/GenomeAnalysisTK_latest_unstable.jar -T M2 \
      --no_cmdline_in_header -dt NONE -ip 50 \
      -R ${ref_fasta} \
      -I:tumor ${tumor_bam} \
      -I:normal ${normal_bam} \
      -L ${intervals} \
      -o ${m2_output_vcf_name}
  }

  runtime {
    docker: "gcr.io/broad-dsde-staging/cep3"
    memory: "12 GB"
    defaultDisks: "local-disk 100 SSD"
  }

  output {
    File m2_output_vcf = "${m2_output_vcf_name}"
  }
}

workflow CancerExomePipeline_v2 {
  call M2
}

Pushing Methods through the CLI

The FireCloud Command Line Interface (CLI) provides another option to upload methods (WDL) to the Method Repository.

First, enter cd firecloud-cli in your Terminal to change directories.

Then, enter the FireCloud CLI push command. The box below shows an example of a command you use to push a method.

firecloud -u https://api.firecloud.org/api -m push -t Workflow -s broad-firecloud-jn-test -n test-method -y 'Test synopsis' file.wdl

You will see a "Successfully pushed" message indicating your method uploaded to FireCloud. You can also confirm your Method pushed by viewing the Method Repository. If your push failed, you will see an error message. To troubleshoot, please share the error message in the Forum.

Note: This example assumes that you downloaded the FireCloud Command Line Interface. You can also run a Docker version without downloading to your local environment.

Description of CLI Commands

  • Enter -u, then state the URL of the FireCloud environment.

  • Enter -m, then state the command (push).

  • Enter -t, then state Workflow as the type for pushing your Method.

  • Enter -s, then state the Method Namespace.

  • Enter -n, then state the Method name.

  • Enter -y, then provide a synopsis of your Method.

  • Finally, enter the file name for your WDL.
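
Putting those flags together reproduces the example push command from above; all values here are the document's example values:

```shell
# Recompose the firecloud push command, flag by flag, and print it for inspection.
url="https://api.firecloud.org/api"  # -u: FireCloud environment
type="Workflow"                      # -t: type for pushing a Method
namespace="broad-firecloud-jn-test"  # -s: Method namespace
name="test-method"                   # -n: Method name

cmd="firecloud -u ${url} -m push -t ${type} -s ${namespace} -n ${name} -y 'Test synopsis' file.wdl"
echo "$cmd"
```

Run the printed command (with the FireCloud CLI installed) to perform the actual push.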

Method Permissions

In order to share a method, you must set its ACL. Each method in the method repository has an ACL attached to it. When a user is granted READER permission, the user:

  • will see the method listed when viewing the contents of the method repository.

  • can execute the method (i.e., run analyses that call for the running of the method)

When a user is granted OWNER permission, the user:

  • has READER permission

  • can edit the method’s ACL

  • can redact a method

Note: When a user uploads a method to the method repository, she is by default granted OWNER permission for the method.

Methods that are set to Publicly Readable will be shared with all FireCloud users.

Method Configurations

Method Configurations bind inputs and outputs to a root entity and its attributes in a data model. The Method Configuration can specify attributes to update upon method completion or output files to generate from the method.

Inputs and Outputs

Inputs and outputs tell the method configuration how to associate with entities and entity attributes. When writing an expression, this refers to the root entity that is selected when configuring a method. To write an expression that binds to attributes associated with the root entity, use the this.<attribute_name> syntax. For example, to access a reference FASTA associated with the root entity, use this.ref_fasta, where ref_fasta is the attribute name associated with your FASTA file pointer.

Example Entity Attributes: control_bai, case_bai, output_vcf, ref_pac, ref_ann, case_bam, case_sample, participant, vcf_output_name, ref_fasta, ref_amb, ref_intervals, control_sample, ref_sa, ref_bwt, control_bam, ref_fai, ref_dict

Input expressions may also dereference intermediate entities from the root entity. For example, to dereference an attribute called ref_fasta on the entity intermediate_entity, use the expression this.intermediate_entity.ref_fasta. Output expressions do not have the ability to dereference intermediate entities.

To map the ref_fasta attribute of HCC1143_pair to the input ref_fasta in the CancerExomePipeline_v2, use an Input Name of CancerExomePipeline.M2.ref_fasta and an Input Value of this.ref_fasta.

Input Name                                Input Value
CancerExomePipeline.M2.ref_fasta          this.ref_fasta
CancerExomePipeline.M2.ref_fasta_dict     this.ref_dict
CancerExomePipeline.M2.ref_fasta_fai      this.ref_fai
CancerExomePipeline.M2.tumor_bam          this.case_bam
CancerExomePipeline.M2.tumor_bai          this.case_bai
CancerExomePipeline.M2.normal_bam         this.control_bam
CancerExomePipeline.M2.normal_bai         this.control_bai
CancerExomePipeline.M2.intervals          this.ref_intervals
CancerExomePipeline.M2.m2_output_vcf_name this.vcf_output_name

Output Name                               Output Value
CancerExomePipeline.M2.m2_output_vcf      this.output_vcf

When writing an expression, you can also type workspace.<attribute_name> to refer to Workspace Attributes you entered in the workspace Summary tab. For example, if you entered a workspace attribute called markers_file, your input can reference this file if you type workspace.markers_file.

Workflow Expansion Expressions

Workflow expansion expressions can be used to expand a workflow whose root entity type is a single entity (participant, sample, or pair) to run on multiple entities (participant_set, sample_set, or pair_set). To launch a workflow on each pair in a pair_set, toggle to pair_set in the Launch Analysis window and enter this.pairs in the Define Expression field.


Default Method Configurations

The method repository will contain "default" method configurations for each of the methods it holds. If you want to run a particular method in your workspace, copy the method’s default configuration in the Method Repository to your workspace.

Method Configurations can be created within a workspace. Users may publish a workspace’s method configuration to the method repository. Each method configuration in the method repository has an ACL.

Users granted READER permission:

  • will see the method configuration listed when viewing the contents of the method repository

  • can copy the method configuration from the method repository to a workspace for which they have WRITER or OWNER permission.

Users granted OWNER permission:

  • have READER permission

  • can edit the method configuration’s ACL

  • can redact the method configuration

  • OWNER permission can only be granted to individuals; it cannot be granted to a group

When a user publishes a method config (copy from workspace to the method repository), that user is granted OWNER permission for the method configuration.

Importing Method Configurations into a Workspace

You can import Method configurations from the methods repository by clicking the Import Configuration… button within the Method Configurations tab.


The import configuration dialog will show all methods you have access to within the repository.


You can click on an individual method configuration to get details and import it into your workspace. The imported method configuration can have a new name and namespace within your workspace to allow the same configuration to be copied.


Configuring a Method and Launching an Analysis

Each method configuration has a set of inputs and outputs that must be configured in order to run an analysis.


Each of these inputs has two parts: the name of the parameter, shown in a gray box, followed by the expression used to convert data from the data model into input values.


The expression on the right specifies which attributes in the data model map to this given input parameter. Every expression starts with the word "this", which corresponds to a single entity.

Each method configuration has a root entity type that defines the entity on which this configuration is intended to run. Input expressions are of the form this.attribute, this.entity.attribute, this.entity1.entity2.attribute, etc.

After you click on Method Configuration, you can click Launch Analysis… to start an analysis.


When launching the method, choose the entity type on which to run it.


If the method runs on a sample, you can choose the sample entity type from the dropdown and select one of the samples that appear in the table. You can also choose an entity type different from the intended entity type in the method configuration.

For instance, if a method runs against a sample and you want to run against the control sample within a pair, you could choose the given pair and write an expression to get the control sample.

Within the method configuration, "this" then refers to the pair’s control sample. Similarly, if the method ran on a sample and you selected a sample set, “this” refers to each sample in the sample set. In this case, one workflow would be created for each entity in the set and submitted all at once.

Each method also has a set of outputs that must be configured to populate analysis results.


These output expressions name an attribute on the entity on which the method was run. For instance, if the output expression for a method run on a sample entity is "output_vcf", the given output from the method will be set to an attribute called "output_vcf" on that entity. If this attribute does not exist on the entity, it will be added. If it already exists, it will be replaced.

To launch an analysis, click the Launch button at the bottom of the dialog.


After an analysis is launched, the Monitor tab appears to show the status of the analysis.

You can check the status of an analysis by clicking on it.


Each analysis will contain 0 or more workflows as shown in the Workflows section.

Statuses in FireCloud

FireCloud displays statuses for workspaces and analysis submissions at varying levels of granularity. This section describes where to find statuses in FireCloud and what each status means at each level.

For information about the high-level status of FireCloud services, go to status.firecloud.org.

Workspace Summary Tab

You can view the status of any workspace from the workspace Summary tab. The screenshot below displays the "Complete" status in the workspace Summary tab.


Workspace Statuses

There are three color-coded statuses for workspaces:

Running: the workspace has running analysis submissions.

Exception: the workspace has at least one failed analysis submission AND the last failed analysis submission is more recent than the last successful analysis submission.

Complete: there are no currently running analysis submissions AND the most recent analysis submission completed successfully.

Note: You can also view color-coded workspace statuses from the Workspaces List.


Monitor Tab Statuses

If you click on any workspace, you can view the status of its analysis submissions through the Monitor tab. The Monitor tab displays statuses at three levels of granularity:

  • Analysis submission

  • Workflow submission

  • Call/Task submission

Analysis Submissions

Analysis submissions display in a table when you first click on the Monitor tab. Each analysis submission displays below the Date column (e.g., May 18, 2016 9:28 PM). The status of each analysis submission displays below the Status column (e.g., Done).


By clicking on an analysis submission in this table (e.g., May 18, 2016 9:28 PM), you can view the status of individual workflows in a separate screen.

Analysis Submissions Statuses

  • Submitted - All workflows have been submitted to the Cromwell workflow execution service and have an associated workflow id.

  • Aborting - The request to abort was received and every workflow in the submission that is currently running is being aborted.

  • Aborted - Every workflow is no longer in a running state, in response to a request to abort.

  • Done - Every workflow is no longer in a running state, which can indicate either failure or success per workflow.

  • Done! - The submission status is "Done", AND either the submission has unstarted workflows OR some finished workflow is not in status "Succeeded".

Workflow Submissions

Workflow submission statuses display in a new screen after you click on an analysis submission. This screen displays the high level status of the analysis submission (e.g., Done) at the top of the page and workflow statuses in the Workflows table.


In the Workflows table, workflow submissions display below the Data Entity column (e.g., HCC1143_WE_pair). Note that "HCC1143_WE_pair" refers to the entity on which the workflow submission was run. For each workflow submission, the status displays below the Status column (e.g., Succeeded).

Workflow Submission Statuses

  • Submitted - The workflow has been received by Cromwell and an associated workflow id has been generated.

  • Running - The workflow is being actively processed by Cromwell.

  • Aborting - Cromwell has processed the request to abort this workflow, but the abort is not complete.

  • Aborted - The workflow has halted completely due to an abort request.

  • Succeeded - The workflow has successfully completed.

  • Failed - The workflow has terminated abnormally due to some error.

  • Launching - The recently launched workflow is being sent to Cromwell.

  • Queued - Each user is permitted 1,000 active (running or aborting) workflows in FireCloud. Each workflow is initially queued and is started provided the user has not exceeded this limit.

By clicking on a workflow submission (e.g., HCC1143_WE_pair), you can view more information about the Started and Ended times, as well as the status for any calls within that workflow. This information will appear in a new screen as shown below. Note that every call represents a task within a workflow.


Call/Task Submissions

In this screen, you can view the status for any call (task) within a workflow by clicking Show. For example, if you click on a call (e.g., ContEstMuTectOncotatorWorkflow.OncotatorTask), you can view the Call ID, Call Status (e.g., Done) and Started and Ended times.


Call/Task Submission Statuses

  • NotStarted - The call/task has not started processing yet.

  • Starting - The call/task has been identified to start processing but is currently initializing internally.

  • Running - The call/task has been sent to Google JES for processing.

  • Failed - The call/task exited abnormally due to an unrecoverable error.

  • Done - The call/task successfully completed.

  • Aborted - The call/task was halted due to a workflow abort request.
