Tags: java, hive, apache-spark, apache-spark-sql
I am getting the error below when I try to run this code.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class App {
    public static void main(String[] args) throws Exception {
        // Note the "/" separator; without it the warehouse path runs into the directory name.
        String warehouseLocation = "file:" + System.getProperty("user.dir") + "/spark-warehouse";
        SparkSession spark = SparkSession
                .builder().master("local")
                .appName("Java Spark Hive Example")
                .config("spark.sql.warehouse.dir", warehouseLocation)
                .enableHiveSupport()
                .getOrCreate();

        String path = "/home/cloudera/Downloads/NetBeansProjects/sparksql1/src/test/Employee.json";
        spark.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)");
        spark.sql("LOAD DATA LOCAL INPATH '" + path + "' INTO TABLE src");

        // Load the JSON file and expose it as a temporary table.
        // (registerTempTable is deprecated in Spark 2.x; createOrReplaceTempView is the replacement.)
        Dataset<Row> df = spark.read().json(path);
        df.registerTempTable("temp_table");
        spark.sql("create table TEST.employee as select * from temp_table");
        df.printSchema();
        df.show();
    }
}
Output:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
    at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:778)
    at com.training.hivetest.App.main(App.java:21)
How can I solve this?
Answer:
Add the following dependency to your Maven project. enableHiveSupport() checks at runtime whether Spark's Hive classes are on the classpath and throws this IllegalArgumentException when they are missing; the spark-hive module supplies them.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
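Make sure the _2.11 suffix matches the Scala version of your Spark build, and that the version matches the Spark version you actually run. If you want to confirm the dependency took effect before rebuilding the whole job, here is a minimal sketch (a hypothetical helper, not part of the original project) that probes for the Hive classes the Spark 2.0-era builder is assumed to check for; the two class names are an assumption based on Spark 2.0-era sources, not guaranteed for other versions.

public class HiveClasspathCheck {
    public static void main(String[] args) {
        // Assumed probe classes: Spark 2.0's enableHiveSupport() is believed to
        // look for these before creating a Hive-enabled session.
        String[] probes = {
            "org.apache.spark.sql.hive.HiveSessionState",
            "org.apache.hadoop.hive.conf.HiveConf"
        };
        for (String className : probes) {
            try {
                Class.forName(className);
                System.out.println("found:   " + className);
            } catch (ClassNotFoundException e) {
                System.out.println("missing: " + className
                        + "  (add spark-hive_2.11 to the classpath)");
            }
        }
    }
}

If both classes are reported as found, enableHiveSupport() should no longer throw; if either is missing, the spark-hive jar is not reaching your runtime classpath.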