Given the following example:
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions._
import spark.implicits._ // needed for toDF and $ outside spark-shell
val testUdf: UserDefinedFunction = udf((a: String, b: String, c: Int) => {
val out = s"test1: $a $b $c"
println(out)
out
})
val testUdf2: UserDefinedFunction = udf((a: String, b: String, c: String) => {
val out = s"test2: $a $b $c"
println(out)
out
})
Seq(("hello", "world", null))
.toDF("a", "b", "c")
.withColumn("c", $"c" cast "Int")
.withColumn("test1", testUdf($"a", $"b", $"c"))
.withColumn("test2", testUdf2($"a", $"b", $"c"))
.show
testUdf never appears to be invoked. There is no error and no warning; the test1 column just comes back null.
Is there a way to detect these silent failures? Also, what is happening here?
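For reference, one workaround sketch (an assumption on my part, not confirmed behavior from the snippet above): Scala's primitive Int cannot hold null, so Spark wraps UDFs taking primitive parameters in a null check and returns null without calling them when any such argument is null. Declaring the nullable parameter as the boxed java.lang.Integer sidesteps that check:

```scala
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

// Sketch of a workaround: use the boxed java.lang.Integer so Spark
// treats the argument as nullable and actually invokes the function.
val testUdfNullable: UserDefinedFunction =
  udf((a: String, b: String, c: java.lang.Integer) => {
    val out = s"test1: $a $b $c" // c renders as "null" for null input
    println(out)
    out
  })
```

Inside the function you can then handle the null explicitly (e.g. `Option(c)`), which also makes the failure mode visible instead of silent.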
Spark 2.4.4, Scala 2.11