Posted by Arv*_*ind

Spark context 'sc' not defined

I am new to Spark, and I am trying to install PySpark by following the site below.

http://ramhiser.com/2015/02/01/configuring-ipython-notebook-support-for-pyspark/

I have tried both installing the pre-built package and building the Spark package myself via SBT.

When I try to run Python code in the IPython Notebook, I get the following error.

    NameError                                 Traceback (most recent call last)
    <ipython-input-1-f7aa330f6984> in <module>()
          1 # Check that Spark is working
    ----> 2 largeRange = sc.parallelize(xrange(100000))
          3 reduceTest = largeRange.reduce(lambda a, b: a + b)
          4 filterReduceTest = largeRange.filter(lambda x: x % 7 == 0).sum()
          5

    NameError: name 'sc' is not defined

In the command window, I can see the following error.

    Failed to find Spark assembly JAR.
    You need to build Spark before running this program.

Note that when I execute the spark-shell command, I do get a Scala prompt.

Update:

With a friend's help, I was able to fix the issue with the Spark assembly JAR by correcting the contents of the .ipython/profile_pyspark/startup/00-pyspark-setup.py file.
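For reference, a startup file of this kind typically just puts Spark's Python sources on the path and runs PySpark's shell bootstrap. The sketch below is an assumption about what such a 00-pyspark-setup.py looks like, not the exact file I used; the py4j zip name in particular varies by Spark release.

```python
# Hypothetical sketch of .ipython/profile_pyspark/startup/00-pyspark-setup.py.
# Assumes SPARK_HOME points at the Spark installation; the py4j version
# in the path below is an assumption and differs between Spark releases.
import os
import sys

spark_home = os.environ.get("SPARK_HOME")
if not spark_home:
    raise ValueError("SPARK_HOME environment variable is not set")

# Make the pyspark package and its bundled py4j importable.
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, os.path.join(spark_home, "python/lib/py4j-0.8.2.1-src.zip"))

# pyspark/shell.py creates the SparkContext and binds it to the name `sc`.
exec(open(os.path.join(spark_home, "python/pyspark/shell.py")).read())
```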

Now I only have the problem with the Spark context variable, so I have changed the title to reflect my current issue.

ipython-notebook pyspark

