Mic*_*rce 2 apache-spark spark-submit
In Spark 2.0, how do I set spark.yarn.executor.memoryOverhead when running spark-submit?
I know that properties like spark.executor.cores can be set with --executor-cores 2. Does this property follow the same pattern, for example --yarn-executor-memoryOverhead 4096?
小智 6
See the example below. These values can also be given in SparkConf.
Example:
# --executor-memory: amount of memory to use per executor process
# --driver-memory:   amount of memory to use for the driver process
# --driver-cores:    number of cores to use for the driver process
./bin/spark-submit \
  --class [your class] \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 17 \
  --conf spark.yarn.executor.memoryOverhead=4096 \
  --executor-memory 35G \
  --conf spark.yarn.driver.memoryOverhead=4096 \
  --driver-memory 35G \
  --executor-cores 5 \
  --driver-cores 5 \
  --conf spark.default.parallelism=170 \
  /path/to/examples.jar
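As noted above, the same values can also be supplied through SparkConf rather than command-line flags. A minimal PySpark sketch (the app name and values are placeholders, and the spark.yarn.* keys are the pre-2.3 names):

from pyspark import SparkConf
from pyspark.sql import SparkSession

# Same memory settings as the spark-submit example, expressed as a SparkConf
conf = (
    SparkConf()
    .setMaster('yarn')
    .setAppName('MemoryOverheadExample')  # placeholder app name
    .set('spark.executor.memory', '35g')
    .set('spark.yarn.executor.memoryOverhead', '4096')  # pre-Spark-2.3 key
    .set('spark.driver.memory', '35g')
    .set('spark.yarn.driver.memoryOverhead', '4096')
)

spark = SparkSession.builder.config(conf=conf).getOrCreate()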
spark.yarn.executor.memoryOverhead is now deprecated:
WARN spark.SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
You can set spark.executor.memoryOverhead programmatically by passing it as a config:
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master('yarn')
    .appName('StackOverflow')
    .config('spark.driver.memory', '35g')
    .config('spark.executor.cores', 5)
    .config('spark.executor.memory', '35g')
    .config('spark.dynamicAllocation.enabled', True)
    .config('spark.dynamicAllocation.maxExecutors', 25)
    .config('spark.executor.memoryOverhead', '4096')  # new key as of Spark 2.3
    .getOrCreate()
)
sc = spark.sparkContext
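To confirm the setting took effect, you can read it back from the session's runtime config (a minimal check against the session built above):

print(spark.conf.get('spark.executor.memoryOverhead'))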
Views: 6115