How do I find the Spark installation directory?

Ani*_*nil 15 java ubuntu apache-spark

I want to change spark-env.sh. How do I find the installation directory on Ubuntu?

I looked in the UI, but couldn't find anything.

whereis spark 

Result: spark:

Here is the output of the `locate spark` command:

/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/11
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/13
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/files
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/jars
/home/sys6002/Desktop/diff spark hadoop.png
/home/sys6002/Desktop/sparkmain
/home/sys6002/Downloads/learning-spark-master.zip
/home/sys6002/Downloads/mongo-spark-master
/home/sys6002/Downloads/spark-1.5.1
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 (2)
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6.tgz
/home/sys6002/Downloads/spark-1.5.1-bin-without-hadoop
/home/sys6002/Downloads/spark-cassandra-connector-master
/home/sys6002/Downloads/spark-core_2.9.3-0.8.0-incubati
home/sys6002/anaconda3/pkgs/odo-0.3.2-np19py34_0/lib/python3.4/site-packages/odo/backends/tests/__pycache__/test_sparksql.cpython-34.pyc
/home/sys6002/spark-example/a.txt
/home/sys6002/spark-example/a.txt~
/home/sys6002/spark-example/pom.xml
/home/sys6002/spark-example/pom.xml~
/home/sys6002/spark-example/src
/home/sys6002/spark-example/src/main
/home/sys6002/spark-example/src/test
/home/sys6002/spark-example/src/main/java
/home/sys6002/spark-example/src/main/java/com
/home/sys6002/spark-example/src/main/java/com/geekcap
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/App.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/WordCount.java~
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java~

/home/sys6002/spark-example/src/test/java/com/geekcap/javaworld/AppTest.java
/usr/share/app-install/desktop/lightspark:lightspark.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare-invite-opener.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare.desktop

小智 19

echo 'sc.getConf.get("spark.home")' | spark-shell

After a moment, your Spark home will be printed, and you will see something like this:

scala> sc.getConf.get("spark.home")
res0: String = /usr/local/lib/python3.7/site-packages/pyspark

So in this case, my Spark home is /usr/local/lib/python3.7/site-packages/pyspark.
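
If Spark was installed through pip, the home reported this way points into site-packages, as above. As a quick cross-check (my own sketch, not part of the original answer; it assumes spark-shell is on your PATH), you can pipe a one-liner that prints the SPARK_HOME environment variable instead:

# Prints SPARK_HOME if the environment variable is set, "not set" otherwise
echo 'println(sys.env.getOrElse("SPARK_HOME", "not set"))' | spark-shell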


Avi*_*mka 12

You can try these two commands:

  1. locate spark

  2. whereis spark

locate - for each given pattern, locate searches one or more databases of file names and displays the file names that contain the pattern. Patterns can contain shell-style metacharacters: '*', '?', and '[]'. The metacharacters do not treat '/' or '.' specially. Therefore, a pattern 'foo*bar' can match a file name that contains 'foo3/bar', and a pattern '*duck*' can match a file name that contains 'lake/.ducky'. Patterns that contain metacharacters should be quoted to protect them from expansion by the shell.

whereis locates the source/binary files and manual sections for the specified files. The supplied names are first stripped of leading pathname components and any (single) trailing extension of the form .ext, for example, .c. Prefixes of s. resulting from use of source code control are also dealt with. whereis then attempts to locate the desired program in a list of standard Linux places.

  • `whereis spark` result: `spark:` (6 upvotes)
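
For example (my own narrowing of the search, assuming locate's database is up to date), you can look for the specific files you care about instead of the bare word spark:

# Find the config file the question wants to edit
locate spark-env.sh

# Find the Spark launcher scripts in the standard locations
whereis spark-shell spark-submit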

小智 5

This also worked for me:

cd $SPARK_HOME

If the environment variable is set, it will take you to the directory where Spark is installed.
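
A small sanity check (a sketch, assuming a Bash-like shell): print the variable before changing into it, so an unset SPARK_HOME does not silently send `cd` to your home directory:

# Show the value, or a warning when the variable is unset
echo "${SPARK_HOME:-SPARK_HOME is not set}"
cd "$SPARK_HOME"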