
Unable to install Pyspark

I want to run Spark on my local machine using pyspark. Following the instructions from here, I used the commands:

$ sbt/sbt assembly
$ ./bin/pyspark
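
If the build succeeds but bin/pyspark misbehaves, constructing a SparkContext directly from a plain Python shell can help separate build problems from launcher-script problems. A minimal sketch, assuming Spark was unpacked under ~/Downloads as in the traceback below (the paths and the "SanityCheck" app name are illustrative, not from the original question):

import sys

# Make the bundled PySpark and Py4J sources importable; adjust these
# paths to wherever your Spark tree actually lives.
sys.path.append("/Users/comp_name/Downloads/spark-0.9.1/python")
sys.path.append("/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip")

from pyspark import SparkContext

# Constructing the context launches the same JVM that bin/pyspark does,
# so any JVM-side failure should surface here as well.
sc = SparkContext("local", "SanityCheck")
print(sc.parallelize(range(10)).sum())  # expect 45
sc.stop()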

The installation completes, but pyspark fails to run, producing the following error (in full):

138:spark-0.9.1 comp_name$ ./bin/pyspark
Python 2.7.6 |Anaconda 1.9.2 (x86_64)| (default, Jan 10 2014, 11:23:15) 
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
    sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
    self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
  File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred …
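
The Py4JJavaError is raised on the Java side while JavaSparkContext is being constructed, so the Python traceback by itself hides the root cause. With Spark 0.9.x a mismatched JVM is a common culprit, so a first diagnostic step (my own suggestion, not part of the original question) is to check which Java the launcher will pick up:

import os
import subprocess

# bin/pyspark falls back to the first `java` on PATH when JAVA_HOME
# is unset, so report both pieces of information.
print("JAVA_HOME = " + os.environ.get("JAVA_HOME", "<not set>"))
subprocess.call(["java", "-version"])  # the version banner goes to stderr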

Tags: python, apache-spark

5 votes · 3 answers · 10k views
