Errors running the CreatePanelOfNormals (PoN) workflow

While running 'CreatePanelOfNormals', some errors occurred. Here is the command I ran: 'java -Xmx16g -Djava.library.path=/Workspace/Software/HDFView/HDFView-2.13.0-Linux/HDFView/2.13.0/lib/ -jar /Workspace/Software/gatk4-latest/gatk-protected.jar CreatePanelOfNormals -I merge.txt -O PoN.tsv'. The log file follows:

[January 9, 2017 6:11:58 PM CST] org.broadinstitute.hellbender.tools.exome.CreatePanelOfNormals --input merge.txt --output PoN.tsv --minimumTargetFactorPercentileThreshold 25.0 --maximumColumnZerosPercentage 2.0 --maximumTargetZerosPercentage 5.0 --extremeColumnMedianCountPercentileThreshold 2.5 --truncatePercentileThreshold 0.1 --numberOfEigenSamples auto --noQC false --dryRun false --disableSpark false --sparkMaster local[*] --help false --version false --verbosity INFO --QUIET false
[January 9, 2017 6:11:58 PM CST] Executing as [email protected] on Linux 3.10.0-514.2.2.el7.x86_64 amd64; OpenJDK 64-Bit Server VM 1.8.0_111-b15; Version: Version:version-unknown-SNAPSHOT
18:11:58.749 INFO CreatePanelOfNormals - Defaults.BUFFER_SIZE : 131072
18:11:58.750 INFO CreatePanelOfNormals - Defaults.COMPRESSION_LEVEL : 5
18:11:58.750 INFO CreatePanelOfNormals - Defaults.CREATE_INDEX : false
18:11:58.750 INFO CreatePanelOfNormals - Defaults.CREATE_MD5 : false
18:11:58.750 INFO CreatePanelOfNormals - Defaults.CUSTOM_READER_FACTORY :
18:11:58.750 INFO CreatePanelOfNormals - Defaults.EBI_REFERENCE_SEVICE_URL_MASK : http://www.ebi.ac.uk/ena/cram/md5/%s
18:11:58.750 INFO CreatePanelOfNormals - Defaults.INTEL_DEFLATER_SHARED_LIBRARY_PATH : null
18:11:58.750 INFO CreatePanelOfNormals - Defaults.NON_ZERO_BUFFER_SIZE : 131072
18:11:58.751 INFO CreatePanelOfNormals - Defaults.REFERENCE_FASTA : null
18:11:58.751 INFO CreatePanelOfNormals - Defaults.TRY_USE_INTEL_DEFLATER : true
18:11:58.751 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO : false
18:11:58.751 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO_FOR_SAMTOOLS : false
18:11:58.751 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO_FOR_TRIBBLE : false
18:11:58.751 INFO CreatePanelOfNormals - Defaults.USE_CRAM_REF_DOWNLOAD : false
18:11:58.752 INFO CreatePanelOfNormals - Deflater JdkDeflater
18:11:58.752 INFO CreatePanelOfNormals - Initializing engine
18:11:58.753 INFO CreatePanelOfNormals - Done initializing engine
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/01/09 18:11:59 INFO SparkContext: Running Spark version 1.5.0
17/01/09 18:11:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/01/09 18:11:59 INFO SecurityManager: Changing view acls to: yangjiatao
17/01/09 18:11:59 INFO SecurityManager: Changing modify acls to: yangjiatao
17/01/09 18:11:59 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yangjiatao); users with modify permissions: Set(yangjiatao)
17/01/09 18:11:59 INFO Slf4jLogger: Slf4jLogger started
17/01/09 18:11:59 INFO Remoting: Starting remoting
17/01/09 18:11:59 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:46229]
17/01/09 18:11:59 INFO Utils: Successfully started service 'sparkDriver' on port 46229.
17/01/09 18:11:59 INFO SparkEnv: Registering MapOutputTracker
17/01/09 18:11:59 INFO SparkEnv: Registering BlockManagerMaster
17/01/09 18:12:00 INFO DiskBlockManager: Created local directory at /tmp/yangjiatao/blockmgr-60a33e19-35f0-4969-8ba4-eb844485d298
17/01/09 18:12:00 INFO MemoryStore: MemoryStore started with capacity 7.7 GB
17/01/09 18:12:00 INFO HttpFileServer: HTTP File server directory is /tmp/yangjiatao/spark-9dd82bfd-6256-43a9-a5c0-b8b831b0bc21/httpd-927e1211-e7c3-42e8-b528-786f806ed19b
17/01/09 18:12:00 INFO HttpServer: Starting HTTP Server
17/01/09 18:12:00 INFO Utils: Successfully started service 'HTTP file server' on port 35578.
17/01/09 18:12:00 INFO SparkEnv: Registering OutputCommitCoordinator
17/01/09 18:12:00 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/01/09 18:12:00 INFO SparkUI: Started SparkUI at http://192.168.0.131:4040
17/01/09 18:12:00 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
17/01/09 18:12:00 INFO Executor: Starting executor ID driver on host localhost
17/01/09 18:12:00 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 37751.
17/01/09 18:12:00 INFO NettyBlockTransferService: Server created on 37751
17/01/09 18:12:00 INFO BlockManagerMaster: Trying to register BlockManager
17/01/09 18:12:00 INFO BlockManagerMasterEndpoint: Registering block manager localhost:37751 with 7.7 GB RAM, BlockManagerId(driver, localhost, 37751)
17/01/09 18:12:00 INFO BlockManagerMaster: Registered BlockManager
18:12:01.177 INFO CreatePanelOfNormals - QC: Beginning creation of QC PoN...
18:12:01.235 INFO HDF5PoNCreator - All 283 targets are kept
17/01/09 18:12:01 INFO SparkUI: Stopped Spark web UI at http://192.168.0.131:4040
17/01/09 18:12:01 INFO DAGScheduler: Stopping DAGScheduler
17/01/09 18:12:01 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/01/09 18:12:01 INFO MemoryStore: MemoryStore cleared
17/01/09 18:12:01 INFO BlockManager: BlockManager stopped
17/01/09 18:12:01 INFO BlockManagerMaster: BlockManagerMaster stopped
17/01/09 18:12:01 INFO SparkContext: Successfully stopped SparkContext
18:12:01.673 INFO CreatePanelOfNormals - Shutting down engine
17/01/09 18:12:01 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
[January 9, 2017 6:12:01 PM CST] org.broadinstitute.hellbender.tools.exome.CreatePanelOfNormals done. Elapsed time: 0.05 minutes.
Runtime.totalMemory()=2264399872
Exception in thread "main" java.lang.UnsatisfiedLinkError: ncsa.hdf.hdf5lib.H5.H5dont_atexit()I
at ncsa.hdf.hdf5lib.H5.H5dont_atexit(Native Method)
at ncsa.hdf.hdf5lib.H5.loadH5Lib(H5.java:365)
at ncsa.hdf.hdf5lib.H5.(H5.java:274)
at ncsa.hdf.hdf5lib.HDF5Constants.(HDF5Constants.java:28)
at org.broadinstitute.hellbender.utils.hdf5.HDF5File$OpenMode.(HDF5File.java:505)
at org.broadinstitute.hellbender.utils.hdf5.HDF5PoNCreator.writeTargetFactorNormalizeReadCountsAndTargetFactors(HDF5PoNCreator.java:185)
at org.broadinstitute.hellbender.utils.hdf5.HDF5PoNCreator.createPoNGivenReadCountCollection(HDF5PoNCreator.java:118)
at org.broadinstitute.hellbender.utils.hdf5.HDF5PoNCreator.createPoN(HDF5PoNCreator.java:88)
at org.broadinstitute.hellbender.tools.exome.CreatePanelOfNormals.runPipeline(CreatePanelOfNormals.java:244)
at org.broadinstitute.hellbender.utils.SparkToggleCommandLineProgram.doWork(SparkToggleCommandLineProgram.java:39)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:102)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:155)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:174)
at org.broadinstitute.hellbender.Main.instanceMain(Main.java:69)
at org.broadinstitute.hellbender.Main.main(Main.java:84)
17/01/09 18:12:01 INFO ShutdownHookManager: Shutdown hook called
17/01/09 18:12:01 INFO ShutdownHookManager: Deleting directory /tmp/yangjiatao/spark-9dd82bfd-6256-43a9-a5c0-b8b831b0bc21
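
For context on the failure itself: the stack trace shows the JVM found a native HDF library on the given java.library.path but could not bind the native method ncsa.hdf.hdf5lib.H5.H5dont_atexit(), which usually means the HDF-Java (JHI5) shared library in that directory does not match the version of the ncsa.hdf.hdf5lib wrapper classes bundled in gatk-protected.jar. A minimal sanity check, assuming the directory from the command above (the exact file names inside an HDFView install may differ):

# Confirm which HDF-Java native libraries are actually in the directory
# passed via -Djava.library.path.
LIBDIR=/Workspace/Software/HDFView/HDFView-2.13.0-Linux/HDFView/2.13.0/lib
ls -l "$LIBDIR" | grep -i jhdf

# Check that libjhdf5.so itself has no unresolved shared-library dependencies,
# which would also surface as an UnsatisfiedLinkError.
ldd "$LIBDIR"/libjhdf5.so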

Answers

  • Yujian Member

    @LeeTL1220 Can you give me the links for downloading? I have tried many times with several versions of HDF, but failed. Also, what do you mean by 'For most architectures, the next release of GATK-protected does not require you to specify the library, it will be built into the jar.'? Thanks very much.

  • Yujian Member
    @LeeTL1220 I adjusted and used HDF 2.11, but errors still occurred, as follows:
    [January 11, 2017 11:24:15 AM CST] org.broadinstitute.hellbender.tools.exome.CreatePanelOfNormals --input merge.1.txt --output PoN.tsv --minimumTargetFactorPercentileThreshold 25.0 --maximumColumnZerosPercentage 2.0 --maximumTargetZerosPercentage 5.0 --extremeColumnMedianCountPercentileThreshold 2.5 --truncatePercentileThreshold 0.1 --numberOfEigenSamples auto --noQC false --dryRun false --disableSpark false --sparkMaster local[*] --help false --version false --verbosity INFO --QUIET false
    [January 11, 2017 11:24:15 AM CST] Executing as [email protected] on Linux 3.10.0-514.2.2.el7.x86_64 amd64; OpenJDK 64-Bit Server VM 1.8.0_111-b15; Version: Version:version-unknown-SNAPSHOT
    11:24:15.180 INFO CreatePanelOfNormals - Defaults.BUFFER_SIZE : 131072
    11:24:15.180 INFO CreatePanelOfNormals - Defaults.COMPRESSION_LEVEL : 5
    11:24:15.180 INFO CreatePanelOfNormals - Defaults.CREATE_INDEX : false
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.CREATE_MD5 : false
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.CUSTOM_READER_FACTORY :
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.EBI_REFERENCE_SEVICE_URL_MASK : http://www.ebi.ac.uk/ena/cram/md5/%s
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.INTEL_DEFLATER_SHARED_LIBRARY_PATH : null
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.NON_ZERO_BUFFER_SIZE : 131072
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.REFERENCE_FASTA : null
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.TRY_USE_INTEL_DEFLATER : true
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO : false
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO_FOR_SAMTOOLS : false
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.USE_ASYNC_IO_FOR_TRIBBLE : false
    11:24:15.181 INFO CreatePanelOfNormals - Defaults.USE_CRAM_REF_DOWNLOAD : false
    11:24:15.183 INFO CreatePanelOfNormals - Deflater JdkDeflater
    11:24:15.183 INFO CreatePanelOfNormals - Initializing engine
    11:24:15.183 INFO CreatePanelOfNormals - Done initializing engine
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    17/01/11 11:24:15 INFO SparkContext: Running Spark version 1.5.0
    17/01/11 11:24:15 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/01/11 11:24:15 INFO SecurityManager: Changing view acls to: yangjiatao
    17/01/11 11:24:15 INFO SecurityManager: Changing modify acls to: yangjiatao
    17/01/11 11:24:15 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yangjiatao); users with modify permissions: Set(yangjiatao)
    17/01/11 11:24:16 INFO Slf4jLogger: Slf4jLogger started
    17/01/11 11:24:16 INFO Remoting: Starting remoting
    17/01/11 11:24:16 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://[email protected]:44781]
    17/01/11 11:24:16 INFO Utils: Successfully started service 'sparkDriver' on port 44781.
    17/01/11 11:24:16 INFO SparkEnv: Registering MapOutputTracker
    17/01/11 11:24:16 INFO SparkEnv: Registering BlockManagerMaster
    17/01/11 11:24:16 INFO DiskBlockManager: Created local directory at /tmp/yangjiatao/blockmgr-03b59930-03e4-40da-9e37-d48b0e41c9a3
    17/01/11 11:24:16 INFO MemoryStore: MemoryStore started with capacity 7.7 GB
    17/01/11 11:24:16 INFO HttpFileServer: HTTP File server directory is /tmp/yangjiatao/spark-f757d0b6-73d9-4ee7-9fe1-4639edb895d1/httpd-90d3431a-3071-4117-9e2d-68cf12daf944
    17/01/11 11:24:16 INFO HttpServer: Starting HTTP Server
    17/01/11 11:24:16 INFO Utils: Successfully started service 'HTTP file server' on port 33975.
    17/01/11 11:24:16 INFO SparkEnv: Registering OutputCommitCoordinator
    17/01/11 11:24:16 INFO Utils: Successfully started service 'SparkUI' on port 4040.
    17/01/11 11:24:16 INFO SparkUI: Started SparkUI at http://192.168.0.131:4040
    17/01/11 11:24:16 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
    17/01/11 11:24:16 INFO Executor: Starting executor ID driver on host localhost
    17/01/11 11:24:17 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44847.
    17/01/11 11:24:17 INFO NettyBlockTransferService: Server created on 44847
    17/01/11 11:24:17 INFO BlockManagerMaster: Trying to register BlockManager
    17/01/11 11:24:17 INFO BlockManagerMasterEndpoint: Registering block manager localhost:44847 with 7.7 GB RAM, BlockManagerId(driver, localhost, 44847)
    17/01/11 11:24:17 INFO BlockManagerMaster: Registered BlockManager
    11:24:17.290 INFO CreatePanelOfNormals - QC: Beginning creation of QC PoN...
    11:24:17.336 INFO HDF5PoNCreator - Discarded 18 target(s) out of 76 with factors below 7.1e-05 (25.00 percentile)
    11:24:17.365 INFO HDF5PoNCreator - Creating /tmp/yangjiatao/qc-pon-40216574254072292.hd5...
    11:24:17.365 INFO HDF5PoNCreator - Setting sample names ...
    11:24:17.370 INFO HDF5PoNCreator - Setting target names ...
    11:24:17.374 INFO HDF5PoNCreator - Setting target factors (58) ...
    11:24:17.377 INFO HDF5PoNCreator - Setting coverage profile (58 x 2) (T)...
    11:24:17.380 INFO HDF5PoNCreator - Setting targets ...
    11:24:17.382 INFO HDF5PoNCreator - Setting raw targets ...
    11:24:17.383 INFO HDF5PoNCreator - Setting raw target names ...
    11:24:17.389 INFO HDF5PoNCreator - No count column dropped due to zero counts; there is no column with a large number of targets with zero counts (<= 2 of 58)
    11:24:17.392 INFO HDF5PoNCreator - No target drop due to too many zero counts; there is no target with a large number of columns with zero counts (<= 1 of 58)
    11:24:17.394 INFO HDF5PoNCreator - No column dropped due to extreme counts outside [0.9987096501, 1.0012903499]
    11:24:17.400 INFO HDF5PoNCreator - No count is 0.0 thus no count needed to be imputed.
    11:24:17.401 INFO HDF5PoNCreator - None of the 116 counts where truncated as they all fall in the non-extreme range [0.44, 1.56]
    11:24:17.402 INFO HDF5PoNCreator - Counts normalized by the column median and log2'd.
    11:24:17.403 INFO HDF5PoNCreator - Counts centered around the BGS center -0.00
    11:24:17.403 INFO HDF5PoNCreator - Starting the SVD decomposition of the log-normal counts ...
    11:24:17.403 INFO SparkConverter - Converting matrix to distributed Spark matrix...
    11:24:17.643 INFO SparkConverter - Done converting matrix to distributed Spark matrix...
    17/01/11 11:24:17 INFO SparkContext: Starting job: first at RowMatrix.scala:65
    17/01/11 11:24:17 INFO DAGScheduler: Got job 0 (first at RowMatrix.scala:65) with 1 output partitions
    17/01/11 11:24:17 INFO DAGScheduler: Final stage: ResultStage 0(first at RowMatrix.scala:65)
    17/01/11 11:24:17 INFO DAGScheduler: Parents of final stage: List()
    17/01/11 11:24:17 INFO DAGScheduler: Missing parents: List()
    17/01/11 11:24:17 INFO DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at SparkConverter.java:47), which has no missing parents
    17/01/11 11:24:17 INFO MemoryStore: ensureFreeSpace(1512) called with curMem=0, maxMem=8246588866
    17/01/11 11:24:17 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1512.0 B, free 7.7 GB)
    17/01/11 11:24:18 INFO MemoryStore: ensureFreeSpace(952) called with curMem=1512, maxMem=8246588866
    17/01/11 11:24:18 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 952.0 B, free 7.7 GB)
    17/01/11 11:24:18 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:44847 (size: 952.0 B, free: 7.7 GB)
    17/01/11 11:24:18 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:861
    17/01/11 11:24:18 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at SparkConverter.java:47)
    17/01/11 11:24:18 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, PROCESS_LOCAL, 2020 bytes)
    17/01/11 11:24:18 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
    17/01/11 11:24:18 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 894 bytes result sent to driver
    17/01/11 11:24:18 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 122 ms on localhost (1/1)
    17/01/11 11:24:18 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
    17/01/11 11:24:18 INFO DAGScheduler: ResultStage 0 (first at RowMatrix.scala:65) finished in 0.139 s
    17/01/11 11:24:18 INFO DAGScheduler: Job 0 finished: first at RowMatrix.scala:65, took 0.775961 s
    17/01/11 11:24:18 INFO SparkContext: Starting job: first at RowMatrix.scala:65
    17/01/11 11:24:18 INFO DAGScheduler: Got job 1 (first at RowMatrix.scala:65) with 4 output partitions
    17/01/11 11:24:18 INFO DAGScheduler: Final stage: ResultStage 1(first at RowMatrix.scala:65)
    17/01/11 11:24:18 INFO DAGScheduler: Parents of final stage: List()
    17/01/11 11:24:18 INFO DAGScheduler: Missing parents: List()
    17/01/11 11:24:18 INFO DAGScheduler: Submitting ResultStage 1 (ParallelCollectionRDD[0] at parallelize at SparkConverter.java:47), which has no missing parents
    17/01/11 11:24:18 INFO MemoryStore: ensureFreeSpace(1512) called with curMem=2464, maxMem=8246588866
    17/01/11 11:24:18 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 1512.0 B, free 7.7 GB)
    17/01/11 11:24:18 INFO MemoryStore: ensureFreeSpace(952) called with curMem=3976, maxMem=8246588866
    17/01/11 11:24:18 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 952.0 B, free 7.7 GB)
    17/01/11 11:24:18 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on localhost:44847 (size: 952.0 B, free: 7.7 GB)
    17/01/11 11:24:18 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:861
    17/01/11 11:24:18 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 1 (ParallelCollectionRDD[0] at parallelize at SparkConverter.java:47)
    17/01/11 11:24:18 INFO TaskSchedulerImpl: Adding task set 1.0 with 4 tasks
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, PROCESS_LOCAL, 2082 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 2, localhost, PROCESS_LOCAL, 2082 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 3, localhost, PROCESS_LOCAL, 2082 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 4, localhost, PROCESS_LOCAL, 2082 bytes)
    17/01/11 11:24:18 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
    17/01/11 11:24:18 INFO Executor: Running task 1.0 in stage 1.0 (TID 2)
    17/01/11 11:24:18 INFO Executor: Running task 2.0 in stage 1.0 (TID 3)
    17/01/11 11:24:18 INFO Executor: Running task 3.0 in stage 1.0 (TID 4)
    17/01/11 11:24:18 INFO Executor: Finished task 2.0 in stage 1.0 (TID 3). 956 bytes result sent to driver
    17/01/11 11:24:18 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 956 bytes result sent to driver
    17/01/11 11:24:18 INFO Executor: Finished task 1.0 in stage 1.0 (TID 2). 956 bytes result sent to driver
    17/01/11 11:24:18 INFO Executor: Finished task 3.0 in stage 1.0 (TID 4). 956 bytes result sent to driver
    17/01/11 11:24:18 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 3) in 55 ms on localhost (1/4)
    17/01/11 11:24:18 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 79 ms on localhost (2/4)
    17/01/11 11:24:18 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 2) in 70 ms on localhost (3/4)
    17/01/11 11:24:18 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 4) in 58 ms on localhost (4/4)
    17/01/11 11:24:18 INFO DAGScheduler: ResultStage 1 (first at RowMatrix.scala:65) finished in 0.082 s
    17/01/11 11:24:18 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
    17/01/11 11:24:18 INFO DAGScheduler: Job 1 finished: first at RowMatrix.scala:65, took 0.100560 s
    17/01/11 11:24:18 INFO SparkContext: Starting job: treeAggregate at RowMatrix.scala:124
    17/01/11 11:24:18 INFO DAGScheduler: Registering RDD 2 (treeAggregate at RowMatrix.scala:124)
    17/01/11 11:24:18 INFO DAGScheduler: Got job 2 (treeAggregate at RowMatrix.scala:124) with 7 output partitions
    17/01/11 11:24:18 INFO DAGScheduler: Final stage: ResultStage 3(treeAggregate at RowMatrix.scala:124)
    17/01/11 11:24:18 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 2)
    17/01/11 11:24:18 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 2)
    17/01/11 11:24:18 INFO DAGScheduler: Submitting ShuffleMapStage 2 (MapPartitionsRDD[2] at treeAggregate at RowMatrix.scala:124), which has no missing parents
    17/01/11 11:24:18 INFO MemoryStore: ensureFreeSpace(3920) called with curMem=4928, maxMem=8246588866
    17/01/11 11:24:18 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 3.8 KB, free 7.7 GB)
    17/01/11 11:24:18 INFO MemoryStore: ensureFreeSpace(2176) called with curMem=8848, maxMem=8246588866
    17/01/11 11:24:18 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 2.1 KB, free 7.7 GB)
    17/01/11 11:24:18 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on localhost:44847 (size: 2.1 KB, free: 7.7 GB)
    17/01/11 11:24:18 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:861
    17/01/11 11:24:18 INFO DAGScheduler: Submitting 60 missing tasks from ShuffleMapStage 2 (MapPartitionsRDD[2] at treeAggregate at RowMatrix.scala:124)
    17/01/11 11:24:18 INFO TaskSchedulerImpl: Adding task set 2.0 with 60 tasks
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 5, localhost, PROCESS_LOCAL, 2009 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 1.0 in stage 2.0 (TID 6, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 2.0 in stage 2.0 (TID 7, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 3.0 in stage 2.0 (TID 8, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 4.0 in stage 2.0 (TID 9, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 5.0 in stage 2.0 (TID 10, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 6.0 in stage 2.0 (TID 11, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 7.0 in stage 2.0 (TID 12, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 8.0 in stage 2.0 (TID 13, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 9.0 in stage 2.0 (TID 14, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 10.0 in stage 2.0 (TID 15, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 11.0 in stage 2.0 (TID 16, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:18 INFO TaskSetManager: Starting task 12.0 in stage 2.0 (TID 17, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 13.0 in stage 2.0 (TID 18, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 14.0 in stage 2.0 (TID 19, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 15.0 in stage 2.0 (TID 20, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 16.0 in stage 2.0 (TID 21, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 17.0 in stage 2.0 (TID 22, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 18.0 in stage 2.0 (TID 23, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 19.0 in stage 2.0 (TID 24, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 20.0 in stage 2.0 (TID 25, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 21.0 in stage 2.0 (TID 26, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 22.0 in stage 2.0 (TID 27, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 23.0 in stage 2.0 (TID 28, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 24.0 in stage 2.0 (TID 29, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 25.0 in stage 2.0 (TID 30, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 26.0 in stage 2.0 (TID 31, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 27.0 in stage 2.0 (TID 32, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 28.0 in stage 2.0 (TID 33, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 29.0 in stage 2.0 (TID 34, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 30.0 in stage 2.0 (TID 35, localhost, PROCESS_LOCAL, 2009 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 31.0 in stage 2.0 (TID 36, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 32.0 in stage 2.0 (TID 37, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 33.0 in stage 2.0 (TID 38, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 34.0 in stage 2.0 (TID 39, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 35.0 in stage 2.0 (TID 40, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 36.0 in stage 2.0 (TID 41, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 37.0 in stage 2.0 (TID 42, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 38.0 in stage 2.0 (TID 43, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 39.0 in stage 2.0 (TID 44, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 40.0 in stage 2.0 (TID 45, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 41.0 in stage 2.0 (TID 46, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 42.0 in stage 2.0 (TID 47, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 43.0 in stage 2.0 (TID 48, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 44.0 in stage 2.0 (TID 49, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 45.0 in stage 2.0 (TID 50, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 46.0 in stage 2.0 (TID 51, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 47.0 in stage 2.0 (TID 52, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 48.0 in stage 2.0 (TID 53, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 49.0 in stage 2.0 (TID 54, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 50.0 in stage 2.0 (TID 55, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 51.0 in stage 2.0 (TID 56, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 52.0 in stage 2.0 (TID 57, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 53.0 in stage 2.0 (TID 58, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 54.0 in stage 2.0 (TID 59, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 55.0 in stage 2.0 (TID 60, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 56.0 in stage 2.0 (TID 61, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 57.0 in stage 2.0 (TID 62, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 58.0 in stage 2.0 (TID 63, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO TaskSetManager: Starting task 59.0 in stage 2.0 (TID 64, localhost, PROCESS_LOCAL, 2071 bytes)
    17/01/11 11:24:19 INFO Executor: Running task 3.0 in stage 2.0 (TID 8)
    17/01/11 11:24:19 INFO Executor: Running task 1.0 in stage 2.0 (TID 6)
    17/01/11 11:24:19 INFO Executor: Running task 0.0 in stage 2.0 (TID 5)
    17/01/11 11:24:19 INFO Executor: Running task 2.0 in stage 2.0 (TID 7)
    17/01/11 11:24:19 INFO Executor: Running task 7.0 in stage 2.0 (TID 12)
    17/01/11 11:24:19 INFO Executor: Running task 5.0 in stage 2.0 (TID 10)
    17/01/11 11:24:19 INFO Executor: Running task 6.0 in stage 2.0 (TID 11)
    17/01/11 11:24:19 INFO Executor: Running task 4.0 in stage 2.0 (TID 9)
    17/01/11 11:24:19 INFO Executor: Running task 9.0 in stage 2.0 (TID 14)
    17/01/11 11:24:19 INFO Executor: Running task 8.0 in stage 2.0 (TID 13)
    17/01/11 11:24:19 INFO Executor: Running task 11.0 in stage 2.0 (TID 16)
    17/01/11 11:24:19 INFO Executor: Running task 10.0 in stage 2.0 (TID 15)
    17/01/11 11:24:19 INFO Executor: Running task 12.0 in stage 2.0 (TID 17)
    17/01/11 11:24:19 INFO Executor: Running task 13.0 in stage 2.0 (TID 18)
    17/01/11 11:24:19 INFO Executor: Running task 14.0 in stage 2.0 (TID 19)
    17/01/11 11:24:19 INFO Executor: Running task 15.0 in stage 2.0 (TID 20)
    17/01/11 11:24:19 INFO Executor: Running task 16.0 in stage 2.0 (TID 21)
    17/01/11 11:24:19 INFO Executor: Running task 17.0 in stage 2.0 (TID 22)
    17/01/11 11:24:19 INFO Executor: Running task 19.0 in stage 2.0 (TID 24)
    17/01/11 11:24:19 INFO Executor: Running task 18.0 in stage 2.0 (TID 23)
    17/01/11 11:24:19 INFO Executor: Running task 20.0 in stage 2.0 (TID 25)
    17/01/11 11:24:19 INFO Executor: Running task 21.0 in stage 2.0 (TID 26)
    17/01/11 11:24:19 INFO Executor: Running task 22.0 in stage 2.0 (TID 27)
    17/01/11 11:24:19 INFO Executor: Running task 23.0 in stage 2.0 (TID 28)
    17/01/11 11:24:19 INFO Executor: Running task 24.0 in stage 2.0 (TID 29)
    17/01/11 11:24:19 INFO Executor: Running task 25.0 in stage 2.0 (TID 30)
    17/01/11 11:24:19 INFO Executor: Running task 26.0 in stage 2.0 (TID 31)
    17/01/11 11:24:19 INFO Executor: Running task 27.0 in stage 2.0 (TID 32)
    17/01/11 11:24:19 INFO Executor: Running task 28.0 in stage 2.0 (TID 33)
    17/01/11 11:24:19 INFO Executor: Running task 29.0 in stage 2.0 (TID 34)
    17/01/11 11:24:19 INFO Executor: Running task 30.0 in stage 2.0 (TID 35)
    17/01/11 11:24:19 INFO Executor: Running task 31.0 in stage 2.0 (TID 36)
    17/01/11 11:24:19 INFO Executor: Running task 32.0 in stage 2.0 (TID 37)
    17/01/11 11:24:19 INFO Executor: Running task 33.0 in stage 2.0 (TID 38)
    17/01/11 11:24:19 INFO Executor: Running task 34.0 in stage 2.0 (TID 39)
    17/01/11 11:24:19 INFO Executor: Running task 35.0 in stage 2.0 (TID 40)
    17/01/11 11:24:19 INFO Executor: Running task 36.0 in stage 2.0 (TID 41)
    17/01/11 11:24:19 INFO Executor: Running task 38.0 in stage 2.0 (TID 43)
    17/01/11 11:24:19 INFO Executor: Running task 39.0 in stage 2.0 (TID 44)
    17/01/11 11:24:19 INFO Executor: Running task 37.0 in stage 2.0 (TID 42)
    17/01/11 11:24:19 INFO Executor: Running task 40.0 in stage 2.0 (TID 45)
    17/01/11 11:24:19 INFO Executor: Running task 42.0 in stage 2.0 (TID 47)
    17/01/11 11:24:19 INFO Executor: Running task 43.0 in stage 2.0 (TID 48)
    17/01/11 11:24:19 INFO Executor: Running task 44.0 in stage 2.0 (TID 49)
    17/01/11 11:24:19 INFO Executor: Running task 45.0 in stage 2.0 (TID 50)
    17/01/11 11:24:19 INFO Executor: Running task 46.0 in stage 2.0 (TID 51)
    17/01/11 11:24:19 INFO Executor: Running task 47.0 in stage 2.0 (TID 52)
    17/01/11 11:24:19 INFO Executor: Running task 48.0 in stage 2.0 (TID 53)
    17/01/11 11:24:19 INFO Executor: Running task 49.0 in stage 2.0 (TID 54)
    17/01/11 11:24:19 INFO Executor: Running task 50.0 in stage 2.0 (TID 55)
    17/01/11 11:24:19 INFO Executor: Running task 41.0 in stage 2.0 (TID 46)
    17/01/11 11:24:19 INFO Executor: Running task 52.0 in stage 2.0 (TID 57)
    17/01/11 11:24:19 INFO Executor: Running task 53.0 in stage 2.0 (TID 58)
    17/01/11 11:24:19 INFO Executor: Running task 54.0 in stage 2.0 (TID 59)
    17/01/11 11:24:19 INFO Executor: Running task 55.0 in stage 2.0 (TID 60)
    17/01/11 11:24:19 INFO Executor: Running task 56.0 in stage 2.0 (TID 61)
    17/01/11 11:24:19 INFO Executor: Running task 58.0 in stage 2.0 (TID 63)
    17/01/11 11:24:19 INFO Executor: Running task 59.0 in stage 2.0 (TID 64)
    17/01/11 11:24:19 INFO Executor: Running task 57.0 in stage 2.0 (TID 62)
    17/01/11 11:24:19 INFO Executor: Running task 51.0 in stage 2.0 (TID 56)
    Jan 11, 2017 11:24:19 AM com.github.fommil.jni.JniLoader liberalLoad
    INFO: successfully loaded /tmp/yangjiatao/jniloader613984611960445225netlib-native_system-linux-x86_64.so
    java: symbol lookup error: /tmp/yangjiatao/jniloader613984611960445225netlib-native_system-linux-x86_64.so: undefined symbol: cblas_dspr
    (the same error was emitted several times, interleaved, by concurrent worker threads)
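
    For anyone decoding this message: 'undefined symbol: cblas_dspr' means the netlib-java native stub that Spark MLlib loaded did find a system BLAS, but that BLAS does not export the CBLAS interface (the cblas_* symbols). A quick check, assuming the reference BLAS lives at the usual path (adjust for your system):

    # Check whether the system BLAS exports the CBLAS symbol netlib-java needs.
    nm -D /usr/lib64/libblas.so | grep cblas_dspr || echo 'cblas_dspr not exported'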
  • Geraldine_VdAuwera Cambridge, MA Member, Administrator, Broadie admin
    @Yujian, please stop posting multiple copies of the same questions. @LeeTL1220 will try to help you in this thread but it may take some time. Posting the same thing elsewhere will not get you an answer faster; it might in fact get you banned.
  • LeeTL1220 Arlington, MA Member, Broadie, Dev ✭✭✭

    @Yujian @Geraldine_VdAuwera This is a follow-up question and I was unaware of the duplication. Regardless, I will answer here.

  • LeeTL1220 Arlington, MA Member, Broadie, Dev ✭✭✭

    @Yujian This is a bit more difficult. I have not seen this issue, but let me see if I can help. However, I need to ask some questions:

    1) What architecture are you running on? Do you have root (or sudo) access?

    2) How did you install HDF5?

    3) Do you have access to docker?

  • Yujian Member

    @LeeTL1220 I have root access. What do you mean by HDF5? I don't know whether HDFView is the same as HDF5. And is docker a Linux command?

  • LeeTL1220 Arlington, MA Member, Broadie, Dev ✭✭✭

    @Yujian How did you install HDFView? Are you running Ubuntu?

  • y_zhang88 Member
    edited May 2018

    @LeeTL1220
    I met this problem too. It ran very well with a single sample as input, but this bug appeared when I input multiple samples... BTW, my version is 4.0.3.0.
    It seems related to Spark, and I just solved it:
    1. Install libblas.so, liblapacke.so, and libopenblas.so (which I lacked).
    2. Add it to the environment: export LD_PRELOAD=/path/to/libopenblas.so
    Then everything works as expected.
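
    For completeness, a consolidated sketch of those two steps (the package manager and paths below are assumptions for a RHEL/CentOS-style system; adjust for your distribution):

    # 1. Install a BLAS/LAPACK stack that includes OpenBLAS, which exports the
    #    CBLAS symbols (e.g. cblas_dspr) that netlib-java expects.
    sudo yum install -y blas lapack openblas   # assumed package names

    # 2. Preload OpenBLAS so the netlib-java native stub resolves its CBLAS
    #    symbols against it instead of the reference BLAS.
    export LD_PRELOAD=/usr/lib64/libopenblas.so   # adjust to your install path

    # 3. Re-run CreatePanelOfNormals from the same shell so LD_PRELOAD applies.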

  • Sheila Broad Institute Member, Broadie, Moderator admin

    @y_zhang88
    Hi,

    Thanks for reporting your solution!

    -Sheila
