Pan*_*mar 13 scala apache-spark
I have installed the following versions: Hadoop 1.0.3, Java "1.7.0_67", Scala 2.11.7, Spark 2.1.1.
I get the error below; can anyone help?
root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing
<console>:14: error: not found: value spark
import spark.implicits._
<console>:14: error: not found: value spark
import spark.sql
Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
hi-*_*zir 40
There are several different solutions.
Get your hostname:
$ hostname
Then try assigning your hostname:
$ sudo hostname -s 127.0.0.1
Start spark-shell.
Add your hostname to the /etc/hosts file (if it is not present):
127.0.0.1 your_hostname
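The hostname steps above can be sketched non-destructively (assuming a Linux shell with `getent` available; the final `echo` only prints the line you would append to /etc/hosts rather than modifying it):

```shell
# The Spark driver binds to the address the machine's hostname resolves to,
# so first see what that hostname is:
hostname
# Check whether the name currently resolves; an unresolvable hostname is a
# common cause of the 'sparkDriver' BindException:
getent hosts "$(hostname)" || echo "$(hostname) does not resolve"
# The line you would append to /etc/hosts to map it to loopback:
echo "127.0.0.1 $(hostname)"
```

If the `getent` lookup fails, appending that last line to /etc/hosts (with sudo) is what the step above does.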
Add the environment variable to load-spark-env.sh:
export SPARK_LOCAL_IP="127.0.0.1"
The steps above solved my problem, but you can also try adding:
export SPARK_LOCAL_IP=127.0.0.1
under the local IP comment in the template file spark-env.sh.template (in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/), and then:
cp spark-env.sh.template spark-env.sh
spark-shell
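Taken together, the spark-env.sh route can be sketched with a stand-in directory (the Homebrew path above is from the answer; `CONF_DIR` below is a temporary placeholder so the commands can be run anywhere, then repeated against your real conf directory):

```shell
# Stand-in for your Spark conf directory, e.g.
# /usr/local/Cellar/apache-spark/2.1.0/libexec/conf
CONF_DIR=$(mktemp -d)
echo '# spark-env.sh.template' > "$CONF_DIR/spark-env.sh.template"

# Create spark-env.sh from the template and add the binding override:
cp "$CONF_DIR/spark-env.sh.template" "$CONF_DIR/spark-env.sh"
echo 'export SPARK_LOCAL_IP=127.0.0.1' >> "$CONF_DIR/spark-env.sh"

# Confirm the line is in place before launching spark-shell:
grep SPARK_LOCAL_IP "$CONF_DIR/spark-env.sh"
```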
If none of the above fixes it, check your firewall and enable it if it is not already enabled.
Sha*_*ala 12
Add SPARK_LOCAL_IP in load-spark-env.sh as:
export SPARK_LOCAL_IP="127.0.0.1"
The load-spark-env.sh file is located in the spark/bin directory.
Or you can add your hostname to your /etc/hosts file as:
127.0.0.1 hostname
You can get your hostname by typing hostname in the terminal.
Hope this resolves the issue!
Viewed: 16274 times