Spark error: driver did not authorize commit


After enabling Spark speculative execution, you may sometimes see a message like the following while a job runs:

WARN TaskSetManager: Lost task 55.0 in stage 15.0 (TID 20815, spark047216)
org.apache.spark.executor.CommitDeniedException: attempt_201604191557_0015_m_000055_0: Not committed because the driver did not authorize commit

With speculative execution enabled, a duplicate of each slow-running task is launched on another executor. When one of the two copies finishes first, the other copy is refused permission to commit its output, which is what produces this WARN. A minimal sketch of enabling speculation follows.
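As a minimal sketch (assuming Spark 2.x+ with SparkSession; the application name and threshold values are illustrative, not from the original post), speculation is turned on via spark.speculation and tuned with its companion settings:

import org.apache.spark.sql.SparkSession

// Enable speculative execution: the scheduler launches backup copies of
// slow tasks, and the copy that loses the race has its commit denied.
val spark = SparkSession.builder()
  .appName("speculation-demo") // hypothetical name
  .config("spark.speculation", "true")
  .config("spark.speculation.multiplier", "1.5") // how much slower than the median counts as "slow"
  .config("spark.speculation.quantile", "0.75")  // fraction of tasks that must finish before speculation kicks in
  .getOrCreate()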

This WARN is raised because the task's commit request was denied by the driver. The resulting failure is deliberately not counted in the stage's failure statistics, so that speculative duplicates do not show up as misleading task failures.
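Conceptually, the driver keeps a per-partition record of which task attempt has been authorized to commit, and every later request for the same partition is turned away. The sketch below is a simplified illustration of that idea, loosely modeled on Spark's OutputCommitCoordinator rather than taken from it:

import scala.collection.mutable

// Simplified model of driver-side commit authorization (illustrative only,
// not Spark's actual implementation).
class CommitCoordinatorSketch {
  // partition -> attempt number that won the right to commit
  private val authorizedAttempt = mutable.Map[Int, Int]()

  // The first attempt to ask for a partition wins; any later attempt is
  // denied, which is what the losing speculative copy experiences as
  // CommitDeniedException.
  def canCommit(partition: Int, attemptNumber: Int): Boolean = synchronized {
    authorizedAttempt.get(partition) match {
      case Some(winner) => winner == attemptNumber
      case None =>
        authorizedAttempt(partition) = attemptNumber
        true
    }
  }
}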

Related source code

org.apache.spark.executor.Executor

case cDE: CommitDeniedException =>
  // Report the task as FAILED, carrying a TaskCommitDenied reason back to the driver
  val reason = cDE.toTaskEndReason
  execBackend.statusUpdate(taskId, TaskState.FAILED, ser.serialize(reason))

org.apache.spark.executor.CommitDeniedException

private[spark] class CommitDeniedException(
    msg: String,
    jobID: Int,
    splitID: Int,
    attemptNumber: Int)
  extends Exception(msg) {

  def toTaskEndReason: TaskEndReason = TaskCommitDenied(jobID, splitID, attemptNumber)
}

org.apache.spark.TaskCommitDenied

case class TaskCommitDenied(
    jobID: Int,
    partitionID: Int,
    attemptNumber: Int) extends TaskFailedReason {

  override def toErrorString: String = s"TaskCommitDenied (Driver denied task commit)" +
    s" for job: $jobID, partition: $partitionID, attemptNumber: $attemptNumber"

  override def countTowardsTaskFailures: Boolean = false
}
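Note countTowardsTaskFailures above: because it returns false, a denied commit is never counted against spark.task.maxFailures, so a losing speculative copy cannot cause the job itself to fail.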