Post by cal*_*man

Spark-submit cannot import SparkContext

I'm running Spark 1.4.1 on my local Mac laptop and can use pyspark interactively without any problems. Spark was installed through Homebrew and I'm using Anaconda Python. However, as soon as I try to use spark-submit, I get the following error:

15/09/04 08:51:09 ERROR SparkContext: Error initializing SparkContext.
java.io.FileNotFoundException: Added file file:test.py does not exist.
    at org.apache.spark.SparkContext.addFile(SparkContext.scala:1329)
    at org.apache.spark.SparkContext.addFile(SparkContext.scala:1305)
    at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:458)
    at org.apache.spark.SparkContext$$anonfun$15.apply(SparkContext.scala:458)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
    at py4j.Gateway.invoke(Gateway.java:214)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
    at py4j.GatewayConnection.run(GatewayConnection.java:207)
    at java.lang.Thread.run(Thread.java:745)
15/09/04 08:51:09 ERROR SparkContext: Error stopping SparkContext after init error.
java.lang.NullPointerException
    at org.apache.spark.network.netty.NettyBlockTransferService.close(NettyBlockTransferService.scala:152)
    at org.apache.spark.storage.BlockManager.stop(BlockManager.scala:1216)
    at org.apache.spark.SparkEnv.stop(SparkEnv.scala:96)
    at org.apache.spark.SparkContext.stop(SparkContext.scala:1659)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
    at …
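The post does not show the contents of test.py or the exact command line, so the following is only a rough sketch (an assumption, not taken from the post) of the kind of script and invocation that produce this scenario:

# test.py -- hypothetical minimal script; the post does not show the real one.
from pyspark import SparkContext

# Constructing the SparkContext is the point at which the driver registers the
# submitted script via addFile(...), which is where the FileNotFoundException
# in the log above is raised.
sc = SparkContext(appName="test")
print(sc.parallelize(range(10)).sum())  # trivial action to exercise the context
sc.stop()

submitted with a command along the lines of: spark-submit test.py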

python anaconda apache-spark pyspark

