I'm experiencing issues when enabling the use of Spark (`--use_gatk_spark`).
I edited `/etc/security/limits.conf` and `/etc/sysctl.conf` on the compute nodes as suggested in https://nf-co.re/sarek/usage#spark-related-issues, but not `/etc/sysconfig/docker`, since I'm using Singularity (should I edit a different file instead?).
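For what it's worth, a quick way to confirm those host-level changes took effect: with Singularity there is no daemon configuration analogous to `/etc/sysconfig/docker`, because containers inherit the host's limits directly, so checking the node itself should be enough (a sketch under that assumption; the exact target values come from the linked docs):

```shell
# Singularity containers inherit the host's ulimits directly (no daemon
# config like /etc/sysconfig/docker is involved), so verify on the node:
ulimit -n                   # soft open-file limit (from /etc/security/limits.conf)
ulimit -Hn                  # hard open-file limit
cat /proc/sys/fs/file-max   # kernel-wide maximum (set via /etc/sysctl.conf)
```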
Steps to reproduce
Steps to reproduce the behavior (I added the SINGULARITYENV_* variables according to #295 (comment)):
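The specific variables are the ones from the linked comment, but the mechanism itself can be sketched: Singularity strips the `SINGULARITYENV_` prefix and injects the remainder into the container environment. The variable name, value, and image path below are illustrative only, not taken from the comment:

```shell
# Any SINGULARITYENV_FOO exported on the host appears as FOO inside the
# container. Name and value here are hypothetical examples.
export SINGULARITYENV_SPARK_LOCAL_IP=127.0.0.1

# Verify from inside the container when singularity is on PATH
# (image.sif is a placeholder path):
command -v singularity >/dev/null \
  && singularity exec image.sif env | grep '^SPARK_LOCAL_IP=' \
  || echo "exported on host: SINGULARITYENV_SPARK_LOCAL_IP=$SINGULARITYENV_SPARK_LOCAL_IP"
```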
12:36:04.453 INFO MarkDuplicatesSpark - ------------------------------------------------------------
12:36:04.454 INFO MarkDuplicatesSpark - The Genome Analysis Toolkit (GATK) v4.1.7.0
12:36:04.454 INFO MarkDuplicatesSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
12:36:04.458 INFO MarkDuplicatesSpark - Initializing engine
12:36:04.458 INFO MarkDuplicatesSpark - Done initializing engine
12:36:04.691 INFO MarkDuplicatesSpark - Shutting down engine
[May 23, 2022 12:36:04 PM UTC] org.broadinstitute.hellbender.tools.spark.transforms.markduplicates.MarkDuplicatesSpark done. Elapsed time: 0.01 minutes.
Runtime.totalMemory()=4557111296
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:716)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:95)
at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:77)
at org.apache.spark.SparkConf$$anonfun$loadFromSystemProperties$3.apply(SparkConf.scala:76)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.setupSparkConf(SparkContextFactory.java:173)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.createSparkContext(SparkContextFactory.java:183)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.getSparkContext(SparkContextFactory.java:117)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:28)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:139)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:163)
at org.broadinstitute.hellbender.Main.mainEntry(Main.java:206)
at org.broadinstitute.hellbender.Main.main(Main.java:292)
Caused by: java.net.UnknownHostException: nodo01: nodo01: No address associated with hostname
at java.net.InetAddress.getLocalHost(InetAddress.java:1506)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:939)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:939)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at org.apache.spark.util.Utils$$anonfun$localCanonicalHostName$1.apply(Utils.scala:996)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localCanonicalHostName(Utils.scala:996)
at org.apache.spark.internal.config.package$.<init>(package.scala:302)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
... 23 more
Caused by: java.net.UnknownHostException: nodo01: No address associated with hostname
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
at java.net.InetAddress.getLocalHost(InetAddress.java:1501)
... 32 more
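The root cause in the trace is not Spark configuration as such: `InetAddress.getLocalHost()` throws `UnknownHostException` because the node name `nodo01` resolves to no address. A diagnostic sketch (the `/etc/hosts` line is an assumption and would need root on the node):

```shell
# Reproduce the failing lookup outside the JVM (names match the log):
hostname                                              # expected: nodo01
getent hosts "$(hostname)" || echo "hostname does not resolve"

# Possible workaround (assumption, requires root): give the hostname a
# local address so InetAddress.getLocalHost() succeeds, e.g.
#   echo "127.0.1.1 nodo01" >> /etc/hosts
# Alternatively, Spark checks the SPARK_LOCAL_IP environment variable
# before falling back to a hostname lookup, so setting it may sidestep
# the resolution entirely.
```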
Log files
log.txt
nextflow.log.txt
Have you provided the following extra information/files:

- `.nextflow.log`
- File system
- Nextflow installation
- Container engine
Should I increase the Java memory options, or is this a bug?
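One data point from the log bears on the memory question: `Runtime.totalMemory()=4557111296` is the JVM heap at the time of failure, and the failure itself is an `UnknownHostException`, not an `OutOfMemoryError`. A quick conversion of the logged number (plain arithmetic, no assumptions):

```shell
# Convert the logged heap size from bytes to MiB: 4557111296 / 1024 / 1024
echo $((4557111296 / 1024 / 1024))   # prints 4346, i.e. about 4.2 GiB
```

So the JVM already had ~4.2 GiB; fixing the hostname resolution seems more likely to help than raising the memory options (my reading of the trace, not verified).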