I'm trying to do integration testing of a Spark job using scalatest and spark-testing-base on Maven. The Spark job reads in a CSV file, validates the results, and inserts the data into a database. I'm trying to test the validation by feeding in files of known format and seeing whether and how they fail; this particular test just makes sure the validation passes. Unfortunately, scalatest can't find my tests.
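(For context, the job under test has roughly the shape sketched below; the object and helper names are placeholders I'm using for illustration, not the real code. It reads the CSV with spark-csv, runs the validation, and then persists the result.)

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{DataFrame, SQLContext}

object ProficiencyJobSketch {
  def run(path: String): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("proficiency-etl"))
    val sqlContext = new SQLContext(sc)

    // read the CSV with spark-csv, treating empty strings as null
    val raw: DataFrame = sqlContext.read
      .format("com.databricks.spark.csv")
      .option("header", "true")
      .option("nullValue", "")
      .load(path)

    val validated = validate(raw)   // placeholder for the real validation step
    writeToDb(validated)            // placeholder for the database insert
  }

  private def validate(df: DataFrame): DataFrame = df
  private def writeToDb(df: DataFrame): Unit = ()
}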
The relevant plugins:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <configuration>
        <skipTests>true</skipTests>
    </configuration>
</plugin>
<!-- enable scalatest -->
<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <version>1.0</version>
    <configuration>
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
    </configuration>
    <executions>
        <execution>
            <id>test</id>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>
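For context, my understanding is that wildcardSuites takes a package or class-name prefix and the plugin runs any compiled suite that matches it. A trivial, Spark-free suite under that package would look roughly like this (the class name is made up for illustration); if even something this simple isn't picked up, the problem would be with discovery or test compilation rather than the Spark fixture:

package com.cainc.data.etl.schema.proficiency

import org.scalatest.FlatSpec

// Minimal suite with no Spark dependency; the class name is hypothetical.
class DiscoverySanityITest extends FlatSpec {
  "the scalatest-maven-plugin" should "discover suites in this package" in {
    assert(1 + 1 == 2)
  }
}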
Here is the test class:
import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.sql.{DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}
// SchemaStrategy comes from this project

class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
  private var schemaStrategy: SchemaStrategy = _
  private var dataReader: DataFrameReader = _

  before {
    val sqlContext = new SQLContext(sc)
    import sqlContext._
    import sqlContext.implicits._

    val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
      .option("header", "true")
      .option("nullValue", "")
    schemaStrategy = …
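The kind of test case the suite is meant to hold, once the reader is configured, is roughly the following (the fixture path and the final assertion are placeholders, and it assumes dataReader ends up holding the configured reader):

"A proficiency CSV in the known-good format" should "pass validation" in {
  val df = dataReader.load("src/test/resources/proficiency/known_good.csv")
  df.count() should be > 0L
  // ...then hand df to schemaStrategy and assert on the validation outcome
}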