With a fresh installation of Spark 2.1, running the pyspark command fails with the following error:
Traceback (most recent call last):
  File "/usr/local/spark/python/pyspark/shell.py", line 43, in <module>
    spark = SparkSession.builder\
  File "/usr/local/spark/python/pyspark/sql/session.py", line 179, in getOrCreate
    session._jsparkSession.sessionState().conf().setConfString(key, value)
  File "/usr/local/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
  File "/usr/local/spark/python/pyspark/sql/utils.py", line 79, in deco
    raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"
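
For context, the shell.py line in the traceback is the shell's default SparkSession bootstrap. A minimal sketch of roughly what pyspark does at that point (assuming Hive classes are on the classpath, as they are in my setup) looks like this:

from pyspark.sql import SparkSession

# Roughly what pyspark's shell.py does when Hive support is available:
# building a Hive-enabled session instantiates HiveSessionState, which is
# where the IllegalArgumentException above is raised.
spark = SparkSession.builder \
    .appName("PySparkShell") \
    .enableHiveSupport() \
    .getOrCreate()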
I have Hadoop and Hive installed on the same machine, and Hive is configured to use MySQL for its metastore. I did not get this error with Spark 2.0.2.
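
If it helps narrow things down: my understanding (which may be wrong) is that switching the catalog implementation to in-memory skips HiveSessionState entirely, so a run like the following should show whether the failure is specific to the Hive/metastore path:

from pyspark.sql import SparkSession

# Diagnostic sketch, not a fix: use the in-memory catalog so that
# HiveSessionState is never created. If this session starts cleanly
# while the Hive-enabled one fails, the Hive/metastore configuration
# is the likely culprit.
spark = SparkSession.builder \
    .appName("no-hive-check") \
    .config("spark.sql.catalogImplementation", "in-memory") \
    .getOrCreate()

print(spark.range(5).collect())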
Can someone point me in the right direction?