After a completely fresh install of pyspark via pip install pyspark, I get the following errors:
> pyspark
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/bin']
/usr/local/bin/pyspark: line 24: /bin/load-spark-env.sh: No such file or directory
/usr/local/bin/pyspark: line 77: /bin/spark-submit: No such file or directory
/usr/local/bin/pyspark: line 77: exec: /bin/spark-submit: cannot execute: No such file or directory
> spark-shell
Could not find valid SPARK_HOME while searching ['/Users', '/usr/local/bin']
/usr/local/bin/spark-shell: line 57: /bin/spark-submit: No such file or directory
What is a valid SPARK_HOME, how do I set it, and why is there no working default?
I have seen instructions on how to set the environment variable by hand after a manual Spark installation, but I would like to know how to set it in this case, where pyspark was installed with pip.
I had only installed Spark via brew install apache-spark, and the spark-shell that came with that installation worked out of the box. It was only after additionally installing pyspark that I got the messages above. Confusing.
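For reference, the manual instructions I had seen boil down to exporting SPARK_HOME yourself before launching the shell. A minimal sketch, assuming Spark was installed with Homebrew; the libexec location is my assumption, so check the real prefix with brew --prefix apache-spark:

> export SPARK_HOME="$(brew --prefix apache-spark)/libexec"   # point at the brew-managed Spark distribution (assumed layout)
> export PATH="$SPARK_HOME/bin:$PATH"                         # make sure the matching launchers are found first
> pyspark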
> brew install apache-spark
actually already provides a working pyspark shell. There was no need to additionally run
> pip install pyspark
In fact, doing that broke my installation.
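If you end up in the same broken state, one plausible way out (just a sketch of what recovery looked like here; the relink step is an assumption and may not be needed on every setup) is to remove the pip package so the launchers it dropped into /usr/local/bin stop shadowing the brew-provided ones:

> pip uninstall pyspark               # remove the pip-installed pyspark stub scripts
> brew link --overwrite apache-spark  # assumption: restore brew's own symlinks if pip overwrote them
> hash -r                             # clear the shell's cached command locations
> pyspark                             # should start the brew-provided shell again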