the*_*ech — python, pythonpath, apache-spark
I am trying to add Spark to my Python path:
(myenv)me@me /home/me$ set SPARK_HOME="/home/me/spark-1.2.1-bin-hadoop2.4"
(myenv)me@me /home/me$ set PYTHONPATH=$PYTHONPATH:$SPARK_HOME:$SPARK_HOME/python:$SPARK_HOME/python/build:$SPARK_HOME/bin
(myenv)me@me /home/me$ python -c 'import sys; print(sys.path)'
['', '/home/me/.virtualenvs/default/lib/python2.7', '/home/me/.virtualenvs/default/lib/python2.7/plat-x86_64-linux-gnu', '/home/me/.virtualenvs/default/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/lib/python2.7/lib-old', '/home/me/.virtualenvs/default/lib/python2.7/lib-dynload', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/home/me/.virtualenvs/default/local/lib/python2.7/site-packages', '/home/me/.virtualenvs/default/lib/python2.7/site-packages']
(myenv)me@me /home/me$ python -c 'import pyspark'
Traceback (most recent call last):
File "<string>", line 1, in <module>
ImportError: No module named pyspark
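A likely cause (my observation, not stated in the post): in bash, `set VAR=value` does not assign or export a variable — it sets the shell's positional parameters — so the child `python` process never sees `SPARK_HOME` or the modified `PYTHONPATH`. A minimal illustration, using a hypothetical `DEMO_VAR`:

```shell
set DEMO_VAR="hello"     # bash: sets positional parameters, NOT a variable
python3 -c 'import os; print(os.environ.get("DEMO_VAR"))'   # prints None

export DEMO_VAR="hello"  # export makes the variable visible to child processes
python3 -c 'import os; print(os.environ.get("DEMO_VAR"))'   # prints hello
```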
Answer (小智, score 6):
I had the same problem; this is what helped.
Just add the following commands to your .bashrc:
export SPARK_HOME=/path/to/your/spark-1.4.1-bin-hadoop2.6
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/build:$PYTHONPATH
export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
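An alternative sketch (my own, not from the answer — `add_spark_to_path` and the `/path/to/...` location are hypothetical placeholders) is to do the same thing from inside Python by editing `sys.path` before importing pyspark:

```python
import glob
import os
import sys

# Hypothetical location -- replace with your actual Spark directory.
SPARK_HOME = "/path/to/spark-1.4.1-bin-hadoop2.6"

def add_spark_to_path(spark_home):
    """Prepend Spark's Python sources and the bundled py4j zip to sys.path,
    mirroring the PYTHONPATH exports above at runtime."""
    entries = [os.path.join(spark_home, "python")]
    # py4j ships inside Spark as a versioned zip; glob avoids hard-coding it.
    entries += glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))
    for entry in entries:
        if entry not in sys.path:
            sys.path.insert(0, entry)
    return entries

add_spark_to_path(SPARK_HOME)
# `import pyspark` should now succeed, provided SPARK_HOME really exists.
```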
I think you are confusing PYTHONPATH with sys.path. But are you sure you need to modify PYTHONPATH at all if pyspark is installed correctly?
Edit:
I haven't used pyspark myself, but would this help? importing pyspark in python shell
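For the PYTHONPATH/sys.path relationship (general Python behaviour, not specific to this question): entries in an exported PYTHONPATH are prepended to sys.path when the interpreter starts, which is presumably why the un-exported `set` assignments in the question never showed up. A quick check, using a hypothetical `/tmp/demo` entry:

```shell
# An exported PYTHONPATH entry appears verbatim in the child interpreter's sys.path.
PYTHONPATH=/tmp/demo python3 -c 'import sys; print("/tmp/demo" in sys.path)'   # prints True
```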
Viewed: 7754 times