I have been following this tutorial to install Spark for Scala: https://www.tutorialspoint.com/apache_spark/apache_spark_installation.htm
However, when I try to run spark-shell I get this error in the console:
/usr/local/spark/bin/spark-shell: line 57: /usr/local/spark/bin/bin/spark-submit: No such file or directory
My .bashrc looks like this:
export PATH = $PATH:/usr/local/spark/bin
export SCALA_HOME=/usr/local/scala/bin
export PYTHONPATH=$SPARK_HOME/python
What am I doing wrong? I previously installed Spark for Python, but now I am trying to use it with Scala. Is Spark mixing up the variables? Thanks.
You have one bin too many in the path being searched:
/usr/local/spark/bin/bin/spark-submit
should be
/usr/local/spark/bin/spark-submit
SPARK_HOME should be /usr/local/spark/ in your case, not /usr/local/spark/bin/ as it apparently is right now.
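As a minimal sketch of the fix, assuming Spark is unpacked at /usr/local/spark (adjust to your actual install directory), the relevant ~/.bashrc lines would be:

export SPARK_HOME=/usr/local/spark    # the install root, not its bin/ subdirectory
export PATH=$PATH:$SPARK_HOME/bin     # puts spark-shell and spark-submit on the PATH

Then run source ~/.bashrc (or open a new terminal) so the change takes effect.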
Following @Wilmerton's answer, I got it working with the following configuration in my ~/.bashrc:
# Apache Spark stuff
export JAVA_HOME=/usr/lib/jvm/default-java/jre
export SPARK_HOME=/usr/lib/spark
export SCALA_HOME=/usr/local/scala/bin
export PATH=$PATH:${SPARK_HOME}/bin
export PATH=$PATH:$SCALA_HOME
(I installed default-jdk with apt-get install default-jdk, and aptitude search jdk yields the following entries with a status other than p:
i default-jdk - Standard Java or Java compatible Development Kit
i A default-jdk-headless - Standard Java or Java compatible Development Kit (headless)
i A openjdk-8-jdk - OpenJDK Development Kit (JDK)
i A openjdk-8-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iBA openjdk-8-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-8-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
i openjdk-9-jdk-headless - OpenJDK Development Kit (JDK) (headless)
iB openjdk-9-jre - OpenJDK Java runtime, using Hotspot JIT
i A openjdk-9-jre-headless - OpenJDK Java runtime, using Hotspot JIT (headless)
)
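As a quick sanity check (a sketch, assuming the paths above), reload the shell configuration and confirm that the right spark-submit is resolved:

source ~/.bashrc
echo $SPARK_HOME        # should print /usr/lib/spark
which spark-submit      # should print /usr/lib/spark/bin/spark-submit
spark-submit --version  # prints the Spark and Scala versions if everything is wired up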