Problem Description
I'm trying to write a unit test case that relies on DataFrame.saveAsTable()
(since it is backed by a file system). I point the hive warehouse parameter to a local disk location:
sql.sql(s"SET hive.metastore.warehouse.dir=file:///home/myusername/hive/warehouse")
By default, the metastore's Embedded Mode should be enabled, so no external database should be required.
But HiveContext seems to be ignoring this setting, since I still get this error when calling saveAsTable():
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/users is not a directory or unable to create one)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:619)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:172)
at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:224)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1071)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1037)
This is quite annoying. Why is it still happening, and how can I fix it?
Recommended Answer
Per http://spark.apache.org/docs/latest/sql-programming-guide.html#sql
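The linked guide covers Hive configuration for Spark SQL. One commonly suggested approach (a sketch for the Spark 1.x API, not verified against every release; the warehouse path mirrors the one in the question) is to set the warehouse directory right after the HiveContext is constructed, before any statement touches the metastore, rather than through a SET statement issued later, by which time the embedded metastore may already have fixed its warehouse location:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Sketch for a Spark 1.x unit test; master and app name are placeholders.
val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("saveAsTable-test"))

val sqlContext = new HiveContext(sc)

// Configure the warehouse location before the first metastore access;
// a `SET hive.metastore.warehouse.dir=...` issued via sql() afterwards
// may arrive too late to take effect.
sqlContext.setConf("hive.metastore.warehouse.dir",
  "file:///home/myusername/hive/warehouse")
```

An alternative, if this still does not take effect, is to place the same property in a hive-site.xml on the test classpath so it is picked up when the embedded metastore initializes.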