poi*_*rez 13
I believe failed tasks are resubmitted, because I have seen the same failed task resubmitted multiple times in the Spark Web UI. However, if the same task fails too many times, the whole job fails:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 120 in stage 91.0 failed 4 times, most recent failure: Lost task 120.3 in stage 91.0
小智 8
Yes, but there is a parameter that sets the maximum number of failures:
spark.task.maxFailures (default: 4): Number of individual task failures before giving up on the job. Should be greater than or equal to 1. Number of allowed retries = this value - 1.
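For illustration, here is a minimal Scala sketch (not part of the original answers) of how this limit could be raised when building a SparkSession; the object and application names are placeholders:

import org.apache.spark.sql.SparkSession

object MaxFailuresExample {
  def main(args: Array[String]): Unit = {
    // With spark.task.maxFailures = 8, a given task may fail up to 8 times
    // (i.e. 7 retries) before the stage, and therefore the job, is aborted.
    val spark = SparkSession.builder()
      .appName("max-failures-example")        // hypothetical app name
      .config("spark.task.maxFailures", "8")
      .getOrCreate()

    // ... run jobs as usual ...

    spark.stop()
  }
}

The same setting can also be supplied at submit time, for example spark-submit --conf spark.task.maxFailures=8 ..., since it needs to be in place before the SparkContext is created.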