Is it possible to extend the number of characters shown in the JobName column of SLURM's sacct output?
For example, I currently get:
JobID JobName Elapsed NCPUS NTasks State
------------ ---------- ---------- ---------- -------- ----------
12345 lengthy_na+ 00:00:01 4 1 FAILED
and I would like:
JobID JobName Elapsed NCPUS NTasks State
------------ ---------- ---------- ---------- -------- ----------
12345 lengthy_name 00:00:01 4 1 FAILED
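A minimal sketch of the kind of invocation I have been experimenting with, assuming sacct's --format option accepts a %width suffix on field names (and that the SACCT_FORMAT environment variable honours the same syntax; I am not certain this is the right mechanism):

# widen the JobName column to 30 characters for a single call
sacct --format="JobID,JobName%30,Elapsed,NCPUS,NTasks,State"

# or set it once for the shell session
export SACCT_FORMAT="JobID,JobName%30,Elapsed,NCPUS,NTasks,State"
sacct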
I have written an application in Scala that uses Spark. I am packaging the application with Maven and running into problems building an "uber" or "fat" jar.
The problem is that the application runs fine inside the IDE, or if I put the non-uber-jar versions of the dependencies on the Java classpath, but it does not work if I put the uber jar on the classpath, i.e.
java -Xmx2G -cp target/spark-example-0.1-SNAPSHOT-jar-with-dependencies.jar debug.spark_example.Example data.txt
does not work. I get the following error message:
ERROR SparkContext: Error initializing SparkContext.
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:168)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:504)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:141)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:118)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:122)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1991)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at debug.spark_example.Example$.main(Example.scala:9)
at debug.spark_example.Example.main(Example.scala)
I would greatly appreciate help understanding what I need to add to my pom.xml file, and why it is needed, to make this work.
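For reference, this is the kind of maven-shade-plugin configuration I have been experimenting with. It is only a minimal sketch: I am assuming the missing 'akka.version' key lives in Akka's reference.conf and that merging the reference.conf files with an AppendingTransformer is the relevant piece (the plugin version is a guess; the main class is the one from the stack trace above):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.1</version> <!-- assumed version -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Merge all reference.conf files instead of letting one overwrite another;
               akka.version is defined in Akka's reference.conf -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
          <!-- Set the main class in the jar manifest (class taken from the stack trace) -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>debug.spark_example.Example</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>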
I have searched online and found the following resources, which I tried (see my pom) but could not get to work: …