Cannot import SparkContext

Mub*_*bin 2 python mapr apache-spark pyspark

I am working on CentOS. I have set $SPARK_HOME and added the path of its bin directory to $PATH.

I can run pyspark from anywhere.
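Worth noting: the pyspark shell works from anywhere because the bin/pyspark launcher adds Spark's Python sources to PYTHONPATH itself before starting the interpreter; a plain python process gets no such help. A quick diagnostic sketch (paths depend on the install):

import os
import sys

# SPARK_HOME is exported, but exporting it alone does not put pyspark on the module path.
print(os.environ.get("SPARK_HOME"))

# For `import pyspark` to work outside the pyspark shell, $SPARK_HOME/python
# (plus Spark's bundled py4j zip) must be on sys.path / PYTHONPATH.
print([p for p in sys.path if "spark" in p.lower()])  # empty in a plain python process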

But when I create a Python file and use this statement:

from pyspark import SparkConf, SparkContext

it throws the following error:

python pysparktask.py
Traceback (most recent call last):
  File "pysparktask.py", line 1, in <module>
    from pyspark import SparkConf, SparkContext
ModuleNotFoundError: No module named 'pyspark'

I tried installing it again with pip:

pip install pyspark

That also fails, with this error:

Could not find a version that satisfies the requirement pyspark (from versions: )
No matching distribution found for pyspark
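(As far as I can tell, pyspark was not published on PyPI until around Spark 2.2, so for a cluster-bundled Spark 2.0.1 install like this one pip has nothing to download, hence the "no matching distribution" message.)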

EDIT

Based on the answer, I updated the code.

The error now is:

Traceback (most recent call last):
  File "pysparktask.py", line 6, in <module>
    from pyspark import SparkConf, SparkContext
  File "/opt/mapr/spark/spark-2.0.1/python/pyspark/__init__.py", line 44, in <module>
    from pyspark.context import SparkContext
  File "/opt/mapr/spark/spark-2.0.1/python/pyspark/context.py", line 33, in <module>
    from pyspark.java_gateway import launch_gateway
  File "/opt/mapr/spark/spark-2.0.1/python/pyspark/java_gateway.py", line 31, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
ModuleNotFoundError: No module named 'py4j'

Afa*_*faq 5

Add the following environment variable, and also append Spark's lib path to sys.path:

import os
import sys

os.environ['SPARK_HOME'] = "/usr/lib/spark/"
sys.path.append("/usr/lib/spark/python/")

from pyspark import SparkConf, SparkContext # And then try to import SparkContext.
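If the import then stops with ModuleNotFoundError: No module named 'py4j' (exactly the error in the question's edit), Spark's bundled py4j zip has to go on sys.path as well. A minimal sketch of that extra step, assuming SPARK_HOME points at the install; the zip's exact name varies by Spark release:

import glob
import os
import sys

# Assumes SPARK_HOME is already exported; falls back to the path used above.
spark_home = os.environ.get("SPARK_HOME", "/usr/lib/spark/")
sys.path.append(os.path.join(spark_home, "python"))

# Spark ships py4j inside $SPARK_HOME/python/lib as a zip whose name carries
# the py4j version (e.g. py4j-0.10.1-src.zip), so glob for it rather than
# hard-coding a version that may not match this Spark release.
sys.path.extend(glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip")))

from pyspark import SparkConf, SparkContext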