Problem description
I am getting the error SQLContext.gerorCreate is not a value of object org.apache.spark.SQLContext. This is my code:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.sql.functions
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types
import java.io.Serializable

case class Sensor(id: String, date: String, temp: String, press: String)

object consum {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("KafkaWordCount").setMaster("local[2]")
    // Only one SparkContext may run per JVM, so reuse it for the StreamingContext
    val sc = new SparkContext(sparkConf)
    val ssc = new StreamingContext(sc, Seconds(2))
    val lines = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("hello" -> 5))

    // Parse a comma-separated Kafka message into a Sensor record
    def parseSensor(str: String): Sensor = {
      val p = str.split(",")
      Sensor(p(0), p(1), p(2), p(3))
    }

    val data = lines.map(_._2).map(parseSensor)

    val sqlcontext = new SQLContext(sc)
    import sqlcontext.implicits._

    data.foreachRDD { rdd =>
      // Compile error reported here: getOrCreate is not a member of an SQLContext instance
      val sensedata = sqlcontext.getOrCreate(rdd.sparkContext)
    }
  }
}
I have tried with SQLContext.getOrCreate as well, but I get the same error.
Recommended answer
No such getOrCreate function is defined for SparkContext or SQLContext. The getOrCreate function is defined on the SparkSession companion object, from which SparkSession instances are created. We then get the sparkContext instance or the sqlContext instance from the SparkSession instance created by that getOrCreate call.
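For illustration, a minimal sketch of this flow on Spark 2.x (the appName and master values are carried over from the question, not required by the API):

import org.apache.spark.sql.SparkSession

// getOrCreate lives on the builder; it returns the existing session or creates one
val spark = SparkSession.builder()
  .appName("KafkaWordCount")
  .master("local[2]")
  .getOrCreate()

val sc = spark.sparkContext       // the underlying SparkContext
val sqlContext = spark.sqlContext // an SQLContext wrapper for legacy code
import spark.implicits._          // enables toDF/toDS conversions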
I hope the explanation is clear.
Updated
The explanation above applies to newer versions of Spark. In the blog post the OP is referencing, the author is using Spark 1.6, and the API doc of 1.6.3 clearly states that getOrCreate is defined on the SQLContext companion object: it gets the singleton SQLContext if one exists, or creates a new one using the given SparkContext.
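So on Spark 1.6 the fix is to call getOrCreate on the SQLContext companion object rather than on an instance; a sketch reusing the data stream and the Sensor case class from the question:

import org.apache.spark.sql.SQLContext

data.foreachRDD { rdd =>
  // Companion-object call: returns the singleton SQLContext, creating it on first use
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._
  val df = rdd.toDF() // Sensor is a case class, so toDF is available via the implicits
  df.show()
}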