Asked by Agu*_*ues

Integrating PySpark with Jupyter Notebook

I am following this website's instructions to install Jupyter Notebook and PySpark and to integrate the two.

When I reached the step of creating a "Jupyter profile", I read that Jupyter profiles no longer exist, so I ran the following commands instead:

$ mkdir -p ~/.ipython/kernels/pyspark

$ touch ~/.ipython/kernels/pyspark/kernel.json

I opened kernel.json and wrote the following:

{
 "display_name": "pySpark",
 "language": "python",
 "argv": [
  "/usr/bin/python",
  "-m",
  "IPython.kernel",
  "-f",
  "{connection_file}"
 ],
 "env": {
  "SPARK_HOME": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7",
  "PYTHONPATH": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python:/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip",
  "PYTHONSTARTUP": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/pyspark/shell.py",
  "PYSPARK_SUBMIT_ARGS": "pyspark-shell"
 }
}
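A quick sanity check worth adding here (my own sketch, not part of the tutorial): the kernel spec launches the exact interpreter named in "argv", so that binary must be able to import IPython. Saving this as a small script and running it with the same interpreter shows what that interpreter can actually see:

```python
import sys

# Report which interpreter is running and whether it can import IPython.
# Run it with the same binary named in kernel.json's "argv", e.g.:
#   /usr/bin/python check_kernel.py
print("interpreter: %s" % sys.executable)
try:
    import IPython
    print("IPython version: %s" % IPython.__version__)
except ImportError:
    print("IPython is NOT importable from this interpreter")
```

If the second line reports that IPython is not importable, the kernel spec is pointing at the wrong Python, regardless of where else IPython happens to be installed.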

The Spark paths are correct.

However, when I run jupyter console --kernel pyspark, I get this output:

MacBook:~ Agus$ jupyter console --kernel pyspark
/usr/bin/python: No module named IPython
Traceback (most recent call last):
  File "/usr/local/bin/jupyter-console", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/site-packages/jupyter_core/application.py", line 267, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs) …
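The traceback already points at the likely cause: /usr/bin/python is the macOS system Python, which has no IPython installed, yet it is the interpreter named in "argv". Two changes would probably be needed: point "argv" at an interpreter that actually has IPython installed (the /usr/local/bin/python path below is only a guess at a Homebrew Python; substitute wherever pip installed IPython on your machine), and use the ipykernel module as the entry point, since in IPython 4 and later the kernel was split out of IPython.kernel into the separate ipykernel package. A hedged sketch of the revised kernel.json:

```json
{
 "display_name": "pySpark",
 "language": "python",
 "argv": [
  "/usr/local/bin/python",
  "-m",
  "ipykernel",
  "-f",
  "{connection_file}"
 ],
 "env": {
  "SPARK_HOME": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7",
  "PYTHONPATH": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python:/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip",
  "PYTHONSTARTUP": "/usr/local/Cellar/spark-2.0.0-bin-hadoop2.7/python/pyspark/shell.py",
  "PYSPARK_SUBMIT_ARGS": "pyspark-shell"
 }
}
```

The "env" block is unchanged from the spec above; only the interpreter path and the module name differ.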

ipython apache-spark pyspark jupyter jupyter-notebook
