The sample code below is taken from the book Advanced Analytics with Spark. When I load it into spark-shell (version 1.4.1), it gives the following errors, indicating that it cannot find StatCounter:

import org.apache.spark.util.StatCounter
<console>:9: error: not found: type StatCounter
        val stats: StatCounter = new StatCounter()
                   ^
<console>:9: error: not found: type StatCounter
        val stats: StatCounter = new StatCounter()
                                     ^
<console>:23: error: not found: type NAStatCounter
        def apply(x: Double) = new NAStatCounter().add(x)

If I simply type the same statements directly into spark-shell, there is no problem:
scala> import org.apache.spark.util.StatCounter
import org.apache.spark.util.StatCounter

scala> val statsCounter: StatCounter = new StatCounter()
statsCounter: org.apache.spark.util.StatCounter = (count: 0, mean: 0.000000, stdev: NaN, max: -Infinity, min: Infinity)

The problem seems to lie with the :load command in spark-shell.

Here is the code:
import org.apache.spark.util.StatCounter
class NAStatCounter extends Serializable {
    val stats: StatCounter = new StatCounter()
    var missing: Long = 0

    def add(x: Double): NAStatCounter = {
        if (java.lang.Double.isNaN(x)) {
            missing += 1
        } else {
            stats.merge(x)
        }
        this
    }

    def merge(other: NAStatCounter): NAStatCounter = {
        stats.merge(other.stats)
        missing += other.missing
        this
    }

    override def toString = {
        "stats: " + stats.toString + " NaN: " + missing
    }
}

object NAStatCounter extends Serializable {
    def apply(x: Double) = new NAStatCounter().add(x)
}
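For context, this class is meant to be used from spark-shell roughly as follows. This is an illustrative sketch only: the RDD name `nums` and its contents are hypothetical, and `sc` is the SparkContext that spark-shell provides.

```scala
// Hypothetical input: an RDD[Double] that may contain NaN values.
val nums = sc.parallelize(Array(1.0, Double.NaN, 17.29))

// Wrap each value in an NAStatCounter, then merge them pairwise,
// so NaNs are counted separately from the running statistics.
val combined = nums.map(NAStatCounter.apply).reduce((a, b) => a.merge(b))

// Prints the merged stats plus the NaN count via toString.
println(combined)
```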

Best answer

I had exactly the same problem.
I solved it along the lines you tried, by changing

val stats: StatCounter = new StatCounter()

into
val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()

The cause is probably that the REPL does not know StatCounter's path when :load processes the file.
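Applied to the whole file, the workaround looks like this: the same class as in the question, with the import dropped and every reference to StatCounter fully qualified so that :load can resolve it.

```scala
class NAStatCounter extends Serializable {
    // Fully qualified so :load resolves the type without relying on an import.
    val stats: org.apache.spark.util.StatCounter = new org.apache.spark.util.StatCounter()
    var missing: Long = 0

    def add(x: Double): NAStatCounter = {
        if (java.lang.Double.isNaN(x)) {
            missing += 1        // count NaNs separately
        } else {
            stats.merge(x)      // fold non-NaN values into the running stats
        }
        this
    }

    def merge(other: NAStatCounter): NAStatCounter = {
        stats.merge(other.stats)
        missing += other.missing
        this
    }

    override def toString = "stats: " + stats.toString + " NaN: " + missing
}

object NAStatCounter extends Serializable {
    def apply(x: Double) = new NAStatCounter().add(x)
}
```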

Regarding "scala - Why does this Spark example code fail to load in spark-shell?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/34971690/
