How do I check the Spark configuration from the command line?

pyt*_*nic 4 linux hadoop scala apache-spark

Basically, I want to check a single property of the Spark configuration from the command line, for example "spark.local.dir", without writing a program. Is there a way to do this?

Nis*_*yal 5

There is no built-in way to view Spark configuration properties directly from the command line.


Instead, you can check the spark-defaults.conf file. Another option is to view the properties in the web UI.


The application web UI at http://driverIP:4040 lists Spark properties in the "Environment" tab. Only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear. For all other configuration properties, you can assume the default value is used.
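That said, if all you need is one value that was explicitly set in spark-defaults.conf, plain text tools will do. A minimal sketch (the sample file created here is only for illustration; on a real installation, point CONF at "$SPARK_HOME/conf/spark-defaults.conf" instead):

```shell
# Sketch: extract one property from a spark-defaults.conf-style file
# without writing a program. The conf format is "key<whitespace>value".
CONF=$(mktemp)

# Sample contents standing in for a real spark-defaults.conf.
printf 'spark.local.dir /mnt/spark-tmp\nspark.master local[*]\n' > "$CONF"

# Print the value whose first field is the requested key.
awk '$1 == "spark.local.dir" { print $2 }' "$CONF"
```

Note this only shows what the file sets; properties left at their defaults, or overridden later via SparkConf, will not appear here.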


For more details, refer to the Spark Configuration documentation.



vaq*_*han 5

The following command prints your conf properties to the console:

    sc.getConf.toDebugString
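A short sketch of how this looks inside spark-shell, which also shows reading a single property instead of the full dump (sc is the SparkContext that spark-shell predefines; SparkConf.getOption returns None when the property was not explicitly set):

    // Dump every explicitly-set property:
    println(sc.getConf.toDebugString)

    // Or read just one (None means the default value is in effect):
    println(sc.getConf.getOption("spark.local.dir"))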