
Spark in other clusters?

datakid · Member
edited September 2017 in Ask the GATK team

After experiencing some issues with GATK 3.2 on CentOS 7.4 (the AVX changes broke everything), we've installed the newer version, 3.8.

I thought I'd install the GATK 4 beta to test, but I'm confused by all this talk of Spark.

We run a SLURM cluster. Do I need to install Spark on top of it to get the performance advantages, or is Spark not strictly necessary for running on a cluster?

If Spark isn't necessary, can we delete the gatk-package-4.beta.5-spark.jar?
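For what it's worth, here is a minimal sketch of what I was planning to test on our SLURM setup. File names, resource requests, and the thread count are placeholders; the trailing `-- --spark-master local[N]` flag is the local-mode pattern the GATK 4 documentation describes for running Spark-enabled tools without a Spark cluster:

```shell
#!/bin/bash
#SBATCH --job-name=gatk4-test
#SBATCH --cpus-per-task=8
#SBATCH --mem=32G

# A regular (non-Spark) GATK 4 tool runs as a plain Java process on one
# node; no Spark installation is required.
gatk HaplotypeCaller \
    -R reference.fasta \
    -I sample.bam \
    -O sample.vcf.gz

# Spark-enabled tools (names ending in "Spark") can still use multiple
# cores on a single node via Spark's built-in local mode, again without
# installing or managing a separate Spark cluster:
gatk MarkDuplicatesSpark \
    -I sample.bam \
    -O sample.dedup.bam \
    -- --spark-master 'local[8]'
```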

(To my shame, I found the answer in this tutorial after asking the question.)
