My job completed and I got the expected RDD count result. I am running an interactive PySpark shell. I am trying to understand what this warning means:

    No stages are running, but numRunningTasks != 0

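For context, here is a minimal sketch of the kind of interactive session described above (the input path and the use of textFile are illustrative, not the original job; sc is the SparkContext the pyspark shell provides):

    # Illustrative only: an interactive count in the pyspark shell.
    rdd = sc.textFile("hdfs:///tmp/points.txt")  # hypothetical input path
    print(rdd.count())                           # the job finishes and returns the expected count
    # The warning above can still show up in the driver log after the job completes.
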
In Spark's internal code I found this:

    // If this is the last stage with pending tasks, mark the scheduler queue as empty
    // This is needed in case the stage is aborted for any reason
    if (stageIdToNumTasks.isEmpty) {
      allocationManager.onSchedulerQueueEmpty()
      if (numRunningTasks != 0) {
        logWarning("No stages are running, but numRunningTasks != 0")
        numRunningTasks = 0
      }
    }

Can someone explain what it means?

I am referring to the task with ID 0.

[screenshot]

I can report seeing this behavior with Spark's MLlib and KMeans(), where one of the two takeSample jobs apparently completes with fewer tasks (see the table below). I am not sure whether the job will fail.
    Id  Description                     Submitted            Duration  Stages: Succeeded/Total  Tasks: Succeeded/Total
    2   takeSample at KMeans.scala:355  2016/08/27 21:39:04  7 s       1/1                      9600/9600
    1   takeSample at KMeans.scala:355  2016/08/27 21:38:57  6 s       1/1                      6608/9600

The input set is 100m points in 256 dimensions.
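For reference, a hedged sketch of the kind of MLlib call that produces those takeSample jobs (RDD-based API; the input path, k and iteration count are placeholders, not the original run):

    # Illustrative only: RDD-based MLlib KMeans, run from the pyspark shell.
    from pyspark.mllib.clustering import KMeans

    # Assume each input line holds one 256-dimensional point as space-separated floats.
    points = sc.textFile("hdfs:///data/points") \
               .map(lambda line: [float(x) for x in line.split()])

    # Center initialization inside KMeans.train is what triggers the
    # takeSample jobs listed in the table above.
    model = KMeans.train(points, k=16, maxIterations=10)
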

Some of the PySpark parameters (master is yarn, deploy mode is cluster):
    spark.dynamicAllocation.enabled             false
    # Better serializer - https://spark.apache.org/docs/latest/tuning.html#data-serialization
    spark.serializer                            org.apache.spark.serializer.KryoSerializer
    spark.kryoserializer.buffer.max             2000m

    # Bigger PermGen space, use 4 byte pointers (since we have < 32GB of memory)
    spark.executor.extraJavaOptions             -XX:MaxPermSize=512m -XX:+UseCompressedOops

    # More memory overhead
    spark.yarn.executor.memoryOverhead          4096
    spark.yarn.driver.memoryOverhead            8192

    spark.executor.cores                        8
    spark.executor.memory                       8G

    spark.driver.cores                          8
    spark.driver.memory                         8G
    spark.driver.maxResultSize                  4G
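
For illustration, the same properties could also be set programmatically when the context is created; a minimal sketch (the app name is made up, and master/deploy mode would normally still come from spark-submit):

    # Illustrative only: spark-defaults-style properties expressed as a SparkConf.
    from pyspark import SparkConf, SparkContext

    conf = (SparkConf()
            .setAppName("kmeans-100m-points")  # hypothetical name
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .set("spark.kryoserializer.buffer.max", "2000m")
            .set("spark.yarn.executor.memoryOverhead", "4096")
            .set("spark.executor.cores", "8")
            .set("spark.executor.memory", "8G")
            .set("spark.driver.maxResultSize", "4G"))

    sc = SparkContext(conf=conf)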

Best answer

What we have is this code:

    ...
    // If this is the last stage with pending tasks, mark the scheduler queue as empty
    // This is needed in case the stage is aborted for any reason
    if (stageIdToNumTasks.isEmpty) {
      allocationManager.onSchedulerQueueEmpty()
      if (numRunningTasks != 0) {
        logWarning("No stages are running, but numRunningTasks != 0")
        numRunningTasks = 0
      }
    }
  }
}

It comes from ExecutorAllocationManager.scala in Spark's GitHub repository; the comment in the code is by far the best explanation.
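Note that this block lives in Spark's dynamic-allocation bookkeeping, so the warning can only come from a context where that machinery is active; a hedged way to check the flag from the shell (standard property name, default shown as "false"):

    # Illustrative only: check whether dynamic allocation is enabled in this context.
    print(sc.getConf().get("spark.dynamicAllocation.enabled", "false"))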

Regarding apache-spark - No stages are running, but numRunningTasks != 0, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/39176520/
