I cannot access SparkConf from the package, even though I have imported org.apache.spark.SparkConf. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print()

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate

    }
}
The sbt dependencies are:
name := "Spark Streaming"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2"
)

But compilation fails with an error saying SparkConf cannot be accessed:
[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error]         val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error]                        ^

Best Answer

It compiles if you add parentheses after SparkConf:
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
The point is that SparkConf is a class, not a function, so its name can also be used as a path for scoping. When you add parentheses after the class name, you make sure you are calling the class constructor and not referring to the name as a path. Here is an example from the Scala shell illustrating the difference:

scala> class C1 { var age = 0; def setAge(a:Int) = {age = a}}
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30)  // this doesn't work

<console>:23: error: not found: value C1
          new C1.setAge(30)
              ^

scala> new C1().setAge(30) // this works

scala>
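To connect this back to the original error message: in Scala, new A.B(...) instantiates a type B reached through a path A (a package or an object), so new SparkConf.setMaster("local[2]") asks the compiler to find a type named setMaster inside an object called SparkConf. Judging from the error text, the only such object it can resolve is Spark's companion object, which is private to the spark package, hence "cannot be accessed". Below is a minimal compilable sketch of that parsing rule; the names PathExample, O, and Inner are made up for illustration:

object PathExample {
    object O { class Inner }

    // new Path.Type instantiates a type reached through a path:
    val ok = new O.Inner

    // new SparkConf.setMaster(...) is parsed the same way: the compiler
    // looks for a type named setMaster inside an object SparkConf, finds
    // only Spark's package-private companion object, and reports the
    // access error instead of calling the constructor.
}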

Regarding "scala - Why does the Scala compiler fail with 'object SparkConf in package spark cannot be accessed in package org.apache.spark'?", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/34108613/
