Error when starting spark-shell

JRR*_*JRR 5 apache-spark

I just downloaded the latest version of Spark, and when I start the spark shell I get the following error:

java.net.BindException: Failed to bind to: /192.168.1.254:0: Service 'sparkDriver' failed after 16 retries!
    at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:393)
    at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:389)

...
...

java.lang.NullPointerException
    at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:193)
    at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:71)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
...
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

Is there something I missed when setting up Spark?

dpe*_*ock 1

See SPARK-8162.

It appears to affect only 1.4.1 and 1.5.0, so you are probably best off running the latest release (1.4.0 at the time of writing).
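As for the BindException at the top of the trace: it usually means Spark could not bind the driver service to the address it auto-detected (192.168.1.254 here). A commonly suggested workaround, separate from the SQLContext bug, is to point Spark at an explicit local address before launching the shell. A minimal sketch, assuming loopback is acceptable for local experimentation (the address and launch path are illustrative):

```shell
# Force Spark to bind the driver to the loopback address instead of
# the auto-detected LAN address (192.168.1.254 in the trace above).
export SPARK_LOCAL_IP=127.0.0.1

# Then relaunch the shell from the Spark install directory, e.g.:
# ./bin/spark-shell

# Confirm the variable is visible to the child process:
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```

If the machine's hostname resolves to an address no interface actually holds, fixing the /etc/hosts entry addresses the same symptom.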