Posts by sir*_*ine

Cannot load main class from JAR file

I have a Spark-Scala application in which I try to print a simple message, "Hello my application". It compiles fine with sbt compile and also runs fine with sbt run: the message is printed successfully, but errors like the following are shown as well:

Hello my application!
16/11/27 15:17:11 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/11/27 15:17:11 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
[success] Total time: 13 s, completed Nov 27, 2016 3:17:12 PM
16/11/27 15:17:12 INFO DiskBlockManager: Shutdown hook called

I cannot make sense of this: is the run actually fine or not? In addition, when I try to load my JAR file after the run, another error is shown.

My command line looks like:

spark-submit "appfilms" --master local[4] target/scala-2.11/system-of-recommandation_2.11-1.0.jar

The error is:

Error: Cannot load main class from JAR file:/root/projectFilms/appfilms
Run with …
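For reference, spark-submit takes its options first and treats the first non-option argument as the application JAR, so the quoted name "appfilms" placed in front of --master is parsed as the JAR path, which is exactly what the error message reports. A minimal sketch of the expected ordering, assuming the main object is called appFilms as in the code shown further down this page (adjust the class name to the real one):

spark-submit \
  --class appFilms \
  --master "local[4]" \
  target/scala-2.11/system-of-recommandation_2.11-1.0.jar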

hadoop scala sbt apache-spark

7 votes
1 answer
30K views

How do I access the Spark Web UI?

I am running a Spark application locally with 4 nodes. When I run my application, it reports that my driver has the address 10.0.2.15:

INFO Utils: Successfully started service 'SparkUI' on port 4040.
INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040

At the end of the run it shows:

INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
INFO MemoryStore: MemoryStore cleared
INFO BlockManager: BlockManager stopped
INFO BlockManagerMaster: BlockManagerMaster stopped
INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
INFO SparkContext: Successfully stopped SparkContext

I tried to access the Spark web UI at 10.0.2.15:4040, but the page is unreachable. Trying the following address did not help either:

 http://localhost:18080

Pinging 10.0.2.15 gives:

Pinging 10.0.2.15 with 32 bytes of data

Waiting …
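The UI on port 4040 is served by the driver itself and disappears as soon as the SparkContext stops, which is why the address is unreachable once the run has finished; http://localhost:18080 is the default address of the Spark History Server, which only shows something if event logging is enabled. A minimal sketch of that approach, where the object name UiDemo and the log directory /tmp/spark-events are assumptions (the directory must exist, and the history server must point spark.history.fs.logDirectory at the same place):

import org.apache.spark.{SparkConf, SparkContext}

object UiDemo {
  def main(args: Array[String]): Unit = {
    // Write event logs so the finished run can be replayed by the History
    // Server (started with sbin/start-history-server.sh).
    val conf = new SparkConf()
      .setAppName("system of recommandation")
      .setMaster("local[4]")
      .set("spark.eventLog.enabled", "true")
      .set("spark.eventLog.dir", "file:///tmp/spark-events")  // assumed path

    val sc = new SparkContext(conf)
    // ... the job runs here; the live UI at http://10.0.2.15:4040 is only
    // reachable while this context is still running.
    sc.stop()
  }
}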

apache-spark

6 votes
1 answer
40K views

java.lang.IllegalArgumentException: java.net.UnknownHostException: tmp

I am working on a Spark-Scala application and building the project with sbt. My directory tree is: projectFilms/src/main/scala/AppFilms. I have 3 files in HDFS under the directory hdfs/tmp/projetFilms/<my_3_Files>. When I run my code with the command "sbt run", it raises the error:

java.lang.IllegalArgumentException: java.net.UnknownHostException: tmp

and this:

[trace] Stack trace suppressed: run last compile:run for the full output.
ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException

Here is my code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd._   

object appFilms {

  val conf = new SparkConf().setAppName("system of recommandation").setMaster("local[*]")
  val sc = new SparkContext(conf)

  def main(args: Array[String]) {
    val files = sc.wholeTextFiles("hdfs://tmp/ProjetFilm/*.dat")
    //val nbfiles = files.count
    println("Hello my application!")
    sc.stop()
  } …
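The UnknownHostException arises because in hdfs://tmp/ProjetFilm/*.dat the segment right after hdfs:// is read as the NameNode host, so Hadoop tries to resolve a machine named "tmp". A minimal sketch of the two usual ways to write the URI, where the explicit host and port are placeholders rather than values taken from the question:

// Use the default filesystem from core-site.xml (note the three slashes):
val files = sc.wholeTextFiles("hdfs:///tmp/ProjetFilm/*.dat")

// Or name the NameNode explicitly; host and port below are placeholders:
// val files = sc.wholeTextFiles("hdfs://namenode-host:8020/tmp/ProjetFilm/*.dat")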

scala sbt apache-spark

4 votes
1 answer
10K views

ERROR ContextCleaner: Error in cleaning thread

I have a project with Spark 1.4.1 and Scala 2.11. When I run it with sbt run (sbt 0.13.12), it shows the following error:

16/12/22 15:36:43 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method)
        at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
        at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:175)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1249)
        at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:172)
        at org.apache.spark.ContextCleaner$$anon$1.run(ContextCleaner.scala:67)
16/12/22 15:36:43 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
        at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
        at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:80)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
        at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1249)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)

Exception: sbt.TrapExitSecurityException thrown from the UncaughtExceptionHandler in thread "run-main-0"
16/12/22 15:36:43 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
        at java.lang.Object.wait(Native Method) …
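These stack traces come from Spark's ContextCleaner and listener-bus threads being interrupted when sbt tears down the run after main() returns; a commonly suggested workaround is to run the application in a forked JVM. A minimal build.sbt sketch under that assumption, where the Spark version is taken from the question and the project name and Scala patch version are assumptions:

name := "system-of-recommandation"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

// Run the application in a separate JVM so sbt's shutdown does not
// interrupt Spark's ContextCleaner and SparkListenerBus threads.
fork in run := true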

scala apache-spark

4 votes
1 answer
2,541 views

Tag statistics

apache-spark ×4

scala ×3

sbt ×2

hadoop ×1