We use the Spec trait in ScalaTest for our tests. When we run the whole suite, the tests do not always run in the same order. Most of the answers on Google suggest defining a suite and listing all the test names explicitly, but that means we have to add a name every time we write a new test.
Is it possible to use DiscoverySuite itself and still define the test execution order, for example running the tests alphabetically? I thought about extending DiscoverySuite, but DiscoverySuite seems to be private to scalatest.

-- More information --

By order I mean: suppose there are tests A, B, C.
class A extends Spec {..}
class B extends Spec {..}
class C extends Spec {..}
Then I want the tests to run in the order (A, B, C). What happens now is that they run in a different order every time.
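For reference, the approach most of those Google answers describe looks roughly like the sketch below. It uses ScalaTest's org.scalatest.Suites, which runs its nested suites in the order they are listed; the class name OrderedSuites is just an illustrative choice.

import org.scalatest.Suites

// Runs A, B, C in the order listed (here: alphabetical).
// The drawback is exactly the one described above: every new test class
// has to be added to this list by hand.
class OrderedSuites extends Suites(
  new A,
  new B,
  new C
)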
I have a mixed Java/Scala project with both JUnit and ScalaTest tests. With the scalatest plugin, Gradle runs the ScalaTest tests in src/test/scala but ignores the JUnit tests in src/test/java. Without the plugin, Gradle runs the JUnit tests but ignores the Scala ones. What trick am I missing?
My build.gradle:
plugins {
    id 'java'
    id 'maven'
    id 'scala'
    id "com.github.maiflai.scalatest" version "0.6-5-g9065d91"
}

sourceCompatibility = 1.8
group = 'org.chrononaut'
version = '1.0-SNAPSHOT'

task wrapper(type: Wrapper) {
    gradleVersion = '2.3'
}

ext {
    scalaMajorVersion = '2.11'
    scalaVersion = "${scalaMajorVersion}.5"
}

repositories {
    mavenCentral()
    mavenLocal()
}

dependencies {
    compile "org.scala-lang:scala-library:${scalaVersion}"
    compile "org.scala-lang.modules:scala-xml_${scalaMajorVersion}:1.0.3"
    compile 'com.google.guava:guava:18.0'
    compile 'javax.xml.bind:jaxb-api:2.2.12'
    compile 'jaxen:jaxen:1.1.6'
    compile 'joda-time:joda-time:2.7'
    compile 'org.joda:joda-convert:1.7'
    compile 'org.apache.commons:commons-lang3:3.3.2'
    compile 'org.jdom:jdom2:2.0.5'
    testCompile 'junit:junit:4.12'
    testCompile 'org.easytesting:fest-assert:1.4'
    testCompile 'org.mockito:mockito-core:1.10.19' …

I get the following error when I try to run tests in IntelliJ (2019.1), with the Scala IntelliJ plugin v2019.1.8, against Scala 2.13:
Exception in thread "ScalaTest-dispatcher" java.lang.NoSuchMethodError: scala.collection.JavaConverters.seqAsJavaListConverter(Lscala/collection/Seq;)Lscala/collection/convert/Decorators$AsJava;
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.treeBuilder.ParallelTreeBuilder.getOrdinalList(ParallelTreeBuilder.java:21)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.treeBuilder.ParallelTreeBuilder$SuiteTree.<init>(ParallelTreeBuilder.java:92)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.treeBuilder.ParallelTreeBuilder.initRun(ParallelTreeBuilder.java:261)
    at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestReporterWithLocation.apply(ScalaTestReporterWithLocation.java:59)
    at org.scalatest.DispatchReporter$Propagator.$anonfun$run$10(DispatchReporter.scala:249)
    at org.scalatest.DispatchReporter$Propagator.$anonfun$run$10$adapted(DispatchReporter.scala:248)
    at scala.collection.immutable.List.foreach(List.scala:312)
    at org.scalatest.DispatchReporter$Propagator.run(DispatchReporter.scala:248)
    at java.lang.Thread.run(Thread.java:745)
Here are my Gradle dependencies:
dependencies {
    implementation 'org.scala-lang:scala-library:2.13.0'
    testImplementation 'org.scalactic:scalactic_2.13:3.0.8'
    testImplementation 'org.scalatest:scalatest_2.13:3.0.8'
}
When I change the dependencies to Scala 2.12.x, the tests run correctly in IntelliJ without any error. What is going on here?
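For background only, not as a fix for the plugin itself: the Java/Scala collection converters were reorganized in 2.13, so bytecode compiled against the 2.12 scala.collection.JavaConverters signatures (which is what the stack trace above suggests for the IntelliJ test runner) no longer links against a 2.13 scala-library. The object and method names below are illustrative; they just show the 2.13 way of doing the same conversion.

import scala.jdk.CollectionConverters._ // 2.13 home of the Java/Scala converters

object Converters213 {
  // Converts a Scala Seq into a java.util.List using the 2.13 API
  def toJavaList[A](xs: Seq[A]): java.util.List[A] = xs.asJava
}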
I run my ScalaTest tests from sbt and the output gets mixed up: ScalaTest prints all the test runs with remarks on them, and the summary statistics get printed somewhere in the middle:
> test
[info] Compiling 1 Scala source to /home/platon/Tor/scala-dojo-02/target/scala-2.9.1/classes...
[info] FunsWithListsTests:
[info] - should return list of labels
[info] - should return the average rating of games belonging to Zenga
[info] - should return the total ratings of all games
[info] - should return the total ratings of EA games *** FAILED ***
[info] 0 did not equal 170 (FunsWithListsTests.scala:35)
[error] Failed: : Total 8, Failed 5, Errors 0, Passed 3, Skipped 0
[info] - should increase …

How do I pass command line arguments to ScalaTest through its Maven plugin? I am looking for something like TestNG's delegateCommandSystemProperties configuration, but the closest things I found in the ScalaTest documentation are:
- argLine: additional JVM options to pass to the forked process
- environmentVariables: additional environment variables to pass to the forked process
- systemProperties: additional system properties to pass to the forked process

But isn't that redundant? For example, if I want to pass environment=development, I have to put the following in pom.xml:
<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <configuration>
        <argLine>-Denvironment=${env}</argLine>
    </configuration>
</plugin>
and then run mvn test -Denv=development. Is there a simpler way to pass command line arguments directly to ScalaTest?
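For context, whichever of those configuration options carries the value, the test itself typically just reads it back as a plain system property. Below is a minimal sketch of that reading side; the class name EnvironmentAwareSpec is hypothetical and the environment property name is only the example from this question.

import org.scalatest.FlatSpec

class EnvironmentAwareSpec extends FlatSpec {
  // Reads the value passed as -Denvironment=..., with a fallback for local runs
  private val environment: String = sys.props.getOrElse("environment", "development")

  "The test run" should "know which environment it targets" in {
    assert(environment.nonEmpty)
  }
}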
How do ScalaTest and Spock differ? What is the added value of each? Which one is better suited to behaviour-driven development (BDD)? Could you share some thoughts on this?
I want to start doing BDD and I want to pick one of the two, so I would like to make an informed decision. That is why I am first gathering as much information as I can, especially since I am a Java programmer and Scala seems to have a significant learning curve.
Any advice, thoughts, or reports of experience are welcome.
Many thanks.
I am running some tests with ScalaTest that depend on a connection to a test server being available. I have currently created my own Spec, something like:
abstract class ServerDependingSpec extends FlatSpec with Matchers {
  def serverIsAvailable: Boolean = {
    // Check if the server is available
  }
}
Is it possible to ignore (but not fail) the tests when this method returns false?
At the moment I do it in a "hackish" way:
"Something" should "do something" in {
if(serverIsAvailable) {
// my test code
}
}
but I would like something like
whenServerAvailable "Something" should "do something" in {
  // test code
}
or
"Something" should "do something" whenServerAvailable {
// test code
}
I suppose I should define my own custom tagging, but all I have to go on is the source code of in and ignore, and I do not understand how to plug in my own implementation.
How can I achieve this?
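One direction that might fit, shown only as a sketch rather than the poster's eventual solution: ScalaTest's assume marks a test as canceled (not failed) when its condition is false, and it can be wrapped in a small helper. The names assumingServerAvailable and SomethingSpec below are my own.

import org.scalatest.{FlatSpec, Matchers}

abstract class ServerDependingSpec extends FlatSpec with Matchers {

  // However the availability check is actually implemented (a real connection attempt, etc.)
  def serverIsAvailable: Boolean

  // Wraps a test body; cancels the test instead of failing it when the server is down
  def assumingServerAvailable(testCode: => Unit): Unit = {
    assume(serverIsAvailable, "test server is not reachable")
    testCode
  }
}

class SomethingSpec extends ServerDependingSpec {
  // Hypothetical stand-in for the real check
  override def serverIsAvailable: Boolean = false

  "Something" should "do something" in assumingServerAvailable {
    1 + 1 shouldBe 2 // test code
  }
}

A canceled test shows up in the report as canceled rather than failed, which is usually what "ignore when the server is unavailable" is meant to achieve.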
I have this code that works 100% from sbt (running sbt test) but throws a compile error in IntelliJ IDEA.
import org.scalatest.{BeforeAndAfter, FunSuite, GivenWhenThen}

class SimpleTest extends FunSuite with GivenWhenThen with BeforeAndAfter {
  test("Simple Test") {
    Given("Why this error?")
    assert("ok" === "ok")
  }
}
The error is:
Error:(5, 10) could not find implicit value for parameter pos: org.scalactic.source.Position
Given("Why this error?")
Error:(5, 10) not enough arguments for method Given: (implicit pos: org.scalactic.source.Position)Unit.
Unspecified value parameter pos.
Given("Why this error?")
Error:(6, 11) could not find implicit value for parameter prettifier: org.scalactic.Prettifier
assert("ok" === "ok")
Error:(6, 11) …

I am writing test cases for Spark with ScalaTest.
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}
class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)
    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}
When I try to compile the test suite, it gives me the following error:
stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error] import spark.implicits._
[error] ^
[error] one error found
[error] (test:compileIncremental) …
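For what it's worth, the compiler complains because spark is a var, and import spark.implicits._ needs a stable identifier (a val, lazy val, or object). Below is a minimal sketch of one common workaround, shown only as an illustration; it assumes spark-sql and scalatest are on the test classpath, and the class name StableImportSpec is hypothetical.

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class StableImportSpec extends FlatSpec with BeforeAndAfterAll {

  // A lazy val is a stable identifier, so the implicits import below compiles;
  // the session is still only created when the first test touches it.
  lazy val spark: SparkSession =
    SparkSession.builder().master("local").appName("stable-import-test").getOrCreate()

  import spark.implicits._

  "A Dataset" should "be created from a local collection" in {
    val ds = Seq(1, 2, 3).toDS()
    assert(ds.count() == 3)
  }

  override def afterAll(): Unit = spark.stop()
}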