DataFrame not found in a Java class (Spark)

Dan*_*nny 3 java dataframe apache-spark apache-spark-sql

I am writing a Java class that uses Spark. I get the error "DataFrame cannot be resolved to a type" and an error on the import: "The import org.apache.spark.sql.DataFrame cannot be resolved". These are the class imports:

import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.DataFrameReader;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

import org.apache.spark.sql.DataFrame;

Here is the pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>SparkBD</groupId>
    <artifactId>SparkProject</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <dependencies>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.3.0</version>
        </dependency>
    </dependencies>
</project>

小智 7

DataFrame was removed from the Java API in Spark 2.0 (in the Scala API it is now just a type alias for Dataset[Row]). You should replace it with Dataset<Row>:

  • Keep only import org.apache.spark.sql.Dataset (and remove import org.apache.spark.sql.DataFrame)
  • Wherever you used DataFrame, use Dataset<Row> instead (see the sketch below)
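
For example, a minimal sketch assuming a Spark 2.3 setup with a local SparkSession; the class name and the people.json input path are illustrative, not from the original question:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DatasetExample {
    public static void main(String[] args) {
        // SparkSession is the Spark 2.x entry point for the SQL API
        SparkSession spark = SparkSession.builder()
                .appName("DatasetExample")
                .master("local[*]")
                .getOrCreate();

        // What was a DataFrame in the Spark 1.x Java API is now Dataset<Row>
        Dataset<Row> df = spark.read().json("people.json"); // hypothetical input file

        df.printSchema();
        df.show();

        spark.stop();
    }
}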