Asked by tri*_*oid (tags: apache-spark, apache-spark-sql, spark-hive)
I'm trying to write a unit test case that depends on DataFrame.saveAsTable() (since it is backed by the file system). I point the Hive warehouse parameter at a local disk location:
sql.sql(s"SET hive.metastore.warehouse.dir=file:///home/myusername/hive/warehouse")
By default the metastore should run in embedded mode, so no external database should be needed.
However, HiveContext seems to ignore this configuration: I still get the following error when calling saveAsTable():
MetaException(message:file:/user/hive/warehouse/users is not a directory or unable to create one)
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:file:/user/hive/warehouse/users is not a directory or unable to create one)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:619)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.createDataSourceTable(HiveMetastoreCatalog.scala:172)
at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:224)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:54)
at org.apache.spark.sql.execution.ExecutedCommand.execute(commands.scala:64)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:1099)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:1099)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1121)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1071)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1037)
This is quite annoying. Why is it still happening, and how can I fix it?
According to http://spark.apache.org/docs/latest/sql-programming-guide.html#sql :
Note that the hive.metastore.warehouse.dir property in hive-site.xml is deprecated since Spark 2.0.0. Instead, use spark.sql.warehouse.dir to specify the default location of databases in the warehouse.
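A minimal sketch of how this could look in a Spark 2.x test setup, assuming a local SparkSession with Hive support on the classpath; the app name and warehouse path below are placeholders, not values from the question:

import org.apache.spark.sql.SparkSession

// spark.sql.warehouse.dir replaces hive.metastore.warehouse.dir as of
// Spark 2.0.0 and must be set before the session is created.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("saveAsTable-test")                                      // placeholder name
  .config("spark.sql.warehouse.dir", "file:///tmp/spark-warehouse") // placeholder path
  .enableHiveSupport() // requires the spark-hive dependency; uses an embedded Derby metastore
  .getOrCreate()

// saveAsTable now writes under the configured warehouse directory
// instead of the default file:/user/hive/warehouse.
spark.range(10).write.saveAsTable("users")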