
java.sql.SQLNonTransientConnectionException after finishing workflow

Hi,

I encountered a java.sql.SQLNonTransientConnectionException (connection exception: closed) after my workflow completed. I am using GATK 4.0.4.0; this has never happened before, and I didn't change anything in my setup.

[INFO] [10/05/2018 18:51:15.720] [cromwell-system-akka.dispatchers.engine-dispatcher-43] [akka://cromwell-system/user/SingleWorkflowRunnerActor/WorkflowManagerActor] WorkflowManagerActor WorkflowActor-e63bc8bc-2c53-4619-bbc8-3c687418a1ff is in a terminal state: WorkflowSucceededState
[INFO] [10/05/2018 18:51:35.764] [cromwell-system-akka.dispatchers.engine-dispatcher-43] [akka://cromwell-system/user/SingleWorkflowRunnerActor] SingleWorkflowRunnerActor workflow finished with status 'Succeeded'.
[ERROR] [10/05/2018 18:51:35.832] [cromwell-system-akka.dispatchers.engine-dispatcher-43] [akka://cromwell-system/user/SingleWorkflowRunnerActor] SingleWorkflowRunnerActor received Failure message: connection exception: closed
java.sql.SQLNonTransientConnectionException: connection exception: closed
    at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)
    at org.hsqldb.jdbc.JDBCUtil.sqlException(Unknown Source)
    at org.hsqldb.jdbc.JDBCClobClient.getSubString(Unknown Source)
    at cromwell.database.sql.SqlConverters$ClobToRawString$.toRawString$extension(SqlConverters.scala:33)
    at cromwell.database.sql.SqlConverters$ClobOptionToRawString$.$anonfun$toRawStringOption$1(SqlConverters.scala:22)
    at scala.Option.map(Option.scala:146)
    at cromwell.database.sql.SqlConverters$ClobOptionToRawString$.toRawStringOption$extension(SqlConverters.scala:22)
    at cromwell.database.sql.SqlConverters$ClobOptionToRawString$.toRawString$extension(SqlConverters.scala:24)
    at cromwell.services.metadata.impl.MetadataDatabaseAccess.$anonfun$metadataToMetadataEvents$4(MetadataDatabaseAccess.scala:99)
    at scala.Option.map(Option.scala:146)
    at cromwell.services.metadata.impl.MetadataDatabaseAccess.$anonfun$metadataToMetadataEvents$1(MetadataDatabaseAccess.scala:98)
    at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:234)
    at scala.collection.Iterator.foreach(Iterator.scala:929)
    at scala.collection.Iterator.foreach$(Iterator.scala:929)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
    at scala.collection.IterableLike.foreach(IterableLike.scala:71)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
    at scala.collection.TraversableLike.map(TraversableLike.scala:234)
    at scala.collection.TraversableLike.map$(TraversableLike.scala:227)
    at scala.collection.AbstractTraversable.map(Traversable.scala:104)
    at cromwell.services.metadata.impl.MetadataDatabaseAccess.metadataToMetadataEvents(MetadataDatabaseAccess.scala:90)
    at cromwell.services.metadata.impl.MetadataDatabaseAccess.$anonfun$queryWorkflowOutputs$1(MetadataDatabaseAccess.scala:142)
    at scala.util.Success.$anonfun$map$1(Try.scala:251)
    at scala.util.Success.map(Try.scala:209)
    at scala.concurrent.Future.$anonfun$map$1(Future.scala:289)
    at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
    at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
    at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
    at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
    at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
    at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
    at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.hsqldb.HsqlException: connection exception: closed
    at org.hsqldb.error.Error.error(Unknown Source)
    at org.hsqldb.error.Error.error(Unknown Source)
    at org.hsqldb.Session.execute(Unknown Source)
    at org.hsqldb.types.ClobDataID.getChars(Unknown Source)
    at org.hsqldb.types.ClobDataID.getSubString(Unknown Source)
    ... 38 more

[INFO] [10/05/2018 18:51:35.842] [cromwell-system-akka.actor.default-dispatcher-33] [akka://cromwell-system/deadLetters] Message [cromwell.core.actor.StreamActorHelper$StreamFailed] without sender to Actor[akka://cromwell-system/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[INFO] [10/05/2018 18:51:35.842] [cromwell-system-akka.actor.default-dispatcher-33] [akka://cromwell-system/deadLetters] Message [cromwell.core.actor.StreamActorHelper$StreamFailed] without sender to Actor[akka://cromwell-system/deadLetters] was not delivered. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[INFO] [10/05/2018 18:51:35.842] [cromwell-system-akka.actor.default-dispatcher-33] [akka://cromwell-system/deadLetters] Message [cromwell.core.actor.StreamActorHelper$StreamFailed] without sender to Actor[akka://cromwell-system/deadLetters] was not delivered. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.

Can someone help with this? Can I safely ignore the error and use the output data?

Best,
Daniel
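
A possible workaround (a sketch based on Cromwell's documented database options, not from this thread; the file path and timeout values below are illustrative placeholders): in single-workflow run mode Cromwell defaults to an in-memory HSQLDB, and the trace above shows a metadata CLOB being read after that database has already shut down. Pointing Cromwell at a file-backed HSQLDB via a config file may avoid the race:

```hocon
# Sketch: file-backed HSQLDB for Cromwell's metadata store.
# Values are illustrative; consult Cromwell's database documentation.
database {
  profile = "slick.jdbc.HsqldbProfile$"
  db {
    driver = "org.hsqldb.jdbcDriver"
    # shutdown=false keeps the database open after the workflow terminates,
    # so late metadata reads do not hit a closed connection.
    url = "jdbc:hsqldb:file:cromwell-db/cromwell-db;shutdown=false;hsqldb.tx=mvcc"
    connectionTimeout = 120000
  }
}
```

Such a file would be supplied with -Dconfig.file=your.conf on the Cromwell command line. Note also that in the log above the error arrives only after WorkflowSucceededState is reported, so the workflow outputs themselves were written before the failure.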
