This article covers how to handle the error "object hive is not a member of package org.apache.spark.sql" when using the Hive context in Spark. It should be a useful reference for anyone hitting the same problem.

Problem Description


I am trying to construct a HiveContext, which inherits from SQLContext.

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I get the following error:

error: object hive is not a member of package org.apache.spark.sql
       val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

I can clearly see from the autocompletion that hive does not exist. Any ideas on how to resolve this? This is an example from the SparkSQL documentation.
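For reference, the documentation example being followed looks roughly like the sketch below. This is a minimal sketch rather than the exact documentation text: it assumes a spark-shell session (where `sc` is the pre-created SparkContext) against a Hive-enabled Spark build, and `src` is just the sample table name the docs used. The `hql(...)` method is the Spark 1.0-era HiveQL entry point; later releases fold this into `sql(...)`.

```scala
// Minimal sketch of the documented HiveContext usage (assumes spark-shell,
// a Hive-enabled Spark build, and the shell's built-in SparkContext `sc`).
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Spark 1.0-era docs used hql(...) for HiveQL queries; newer versions use sql(...).
hiveContext.hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
hiveContext.hql("SELECT key, value FROM src").collect().foreach(println)
```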

Thanks

Recommended Answer

Because of Hive's dependencies, it is not compiled into the Spark binary by default, so you have to build it yourself. Quoting from the website:

However, since Hive has a large number of dependencies, it is not included in the default Spark assembly. In order to use Hive you must first run sbt/sbt -Phive assembly/assembly (or use -Phive for maven).
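Once the rebuild with the Hive profile has finished, a quick way to confirm the fix is that the import now resolves in a fresh spark-shell launched from the Hive-enabled assembly. A minimal check, assuming the shell's built-in `sc`:

```scala
// Run in a spark-shell started from the Hive-enabled assembly;
// `sc` is the SparkContext the shell creates for you.
import org.apache.spark.sql.hive.HiveContext

// No longer fails with "object hive is not a member of package org.apache.spark.sql".
val hiveContext = new HiveContext(sc)
```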

That concludes this post on the error "object hive is not a member of package org.apache.spark.sql" when using the Hive context in Spark. We hope the recommended answer helps, and thank you for your support!

