I have been using Spark 1.6.1 and am now evaluating the Spark 2.0 Preview, but I cannot find org.apache.spark.sql.Row. I need it because I am migrating my DataFrame code from 1.6.1 to the 2.0 Preview. Am I missing something here? My Maven dependencies are pasted below:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0-preview</version>
    <scope>system</scope>
    <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//jars//spark-core_2.11-2.0.0-preview.jar</systemPath>
</dependency>
<dependency>
    <groupId>com.oracle</groupId>
    <artifactId>ojdbc7</artifactId>
    <version>12.1.0.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0-preview</version>
    <scope>system</scope>
    <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//jars//spark-sql_2.11-2.0.0-preview.jar</systemPath>
</dependency>
小智 answered: In Spark 2.0.0, Row has been moved to another jar file. Add the following to your Maven dependencies:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.11</artifactId>
    <version>2.0.0-preview</version>
    <scope>system</scope>
    <systemPath>C://spark-2.0.0-preview-bin-hadoop2.7//jars//spark-catalyst_2.11-2.0.0-preview.jar</systemPath>
</dependency>
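For a quick sanity check that the import now resolves, here is a minimal sketch in Java (the class name RowCheck and the sample schema and values are illustrative assumptions, not from the original post); it only touches the standard Row, RowFactory, and DataTypes APIs:

import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class RowCheck {
    public static void main(String[] args) {
        // Compiles only if org.apache.spark.sql.Row resolves, i.e. the
        // catalyst jar is on the classpath alongside spark-core/spark-sql.
        Row row = RowFactory.create(1, "alice");

        // Illustrative schema matching the row above.
        StructType schema = DataTypes.createStructType(new StructField[]{
            DataTypes.createStructField("id", DataTypes.IntegerType, false),
            DataTypes.createStructField("name", DataTypes.StringType, true)
        });

        System.out.println("row=" + row + " schema=" + schema.simpleString());
    }
}

If this compiles against the three system-scoped jars above, org.apache.spark.sql.Row is being picked up from spark-catalyst_2.11-2.0.0-preview.jar.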