Spark Scala UDF limited to 10 parameters

Asked by RaA*_*aAm — scala, user-defined-functions, apache-spark, apache-spark-sql

I need to create a Spark UDF with 11 parameters. Is there any way to achieve this? As far as I know, a UDF can be created with at most 10 parameters.

Below is the code for 10 parameters. It works:

import org.apache.spark.sql.functions.udf
import org.apache.commons.lang3.StringUtils.isEmpty // assuming isEmpty comes from commons-lang3

val testFunc1 = (one: String, two: String, three: String, four: String,
                 five: String, six: String, seven: String, eight: String,
                 nine: String, ten: String) => {
  if (isEmpty(four)) false
  else four match {
    case "RDIS" => three == "ST"
    case "TTSC" => nine == "UT" && eight == "RR"
    case _ => false
  }
}

udf(testFunc1) // compiles: functions.udf has typed overloads up to 10 arguments

Below is the code for 11 parameters. It fails with an "Unspecified value parameter: dataType" compile error:

import org.apache.spark.sql.functions.udf
import org.apache.commons.lang3.StringUtils.isEmpty // assuming isEmpty comes from commons-lang3

val testFunc2 = (one: String, two: String, three: String, four: String,
                 five: String, six: String, seven: String, eight: String,
                 nine: String, ten: String, eleven: String) => {
  if (isEmpty(four)) false
  else four match {
    case "RDIS" => three == "ST"
    case "TTSC" => nine == "UT" && eight == "RR" && eleven == "OR"
    case _ => false
  }
}

udf(testFunc2) // compilation error: Unspecified value parameter dataType
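The reason for the error: org.apache.spark.sql.functions only defines the typed udf overloads for Scala functions of 0 to 10 arguments (Function0 through Function10). With 11 arguments the compiler falls back to the untyped overload udf(f: AnyRef, dataType: DataType), and since no dataType is supplied it reports "Unspecified value parameter dataType". That untyped overload is itself a possible workaround, at the cost of compile-time type checking (note it is deprecated as of Spark 3.0). A minimal sketch, reusing testFunc2 from above:

import org.apache.spark.sql.functions.udf
import org.apache.spark.sql.types.BooleanType

// Untyped variant: pass the closure plus an explicit return type.
// Spark dispatches on the function's arity at runtime, so more than
// 10 parameters are accepted here.
val testUDF2 = udf(testFunc2, BooleanType)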

Answered by Rap*_*oth

I would suggest packing the parameters into a Map:

import org.apache.spark.sql.functions._
// assumes a spark-shell session, where sc and the DataFrame implicits ($, toDF) are in scope

val df = sc.parallelize(Seq(("a", "b"), ("c", "d"), ("e", "f"))).toDF("one", "two")

// the UDF takes a single Map column instead of one parameter per value
val myUDF = udf((input: Map[String, String]) => {
  // do something with the input
  input("one") == "a"
})

df
  .withColumn("udf_args", map(
    lit("one"), $"one",
    lit("two"), $"two" // fixed from $"one" so the map carries both columns
  ))
  .withColumn("udf_result", myUDF($"udf_args"))
  .show()

+---+---+--------------------+----------+
|one|two|            udf_args|udf_result|
+---+---+--------------------+----------+
|  a|  b|Map(one -> a, two...|      true|
|  c|  d|Map(one -> c, two...|     false|
|  e|  f|Map(one -> e, two...|     false|
+---+---+--------------------+----------+
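Applied to the original question, all 11 values can be packed into a single map column and the matching logic moved into a one-argument UDF. A minimal sketch, assuming the source DataFrame df has string columns named one through eleven (hypothetical names; substitute your real ones):

import org.apache.spark.sql.functions._

// assumed column names for illustration
val cols = Seq("one", "two", "three", "four", "five", "six",
               "seven", "eight", "nine", "ten", "eleven")

// same logic as testFunc2, reading values out of the map;
// getOrElse("") stands in for the original isEmpty check
val testFunc2Map = udf((in: Map[String, String]) =>
  in.getOrElse("four", "") match {
    case "RDIS" => in.getOrElse("three", "") == "ST"
    case "TTSC" => in.getOrElse("nine", "") == "UT" &&
                   in.getOrElse("eight", "") == "RR" &&
                   in.getOrElse("eleven", "") == "OR"
    case _ => false
  }
)

// build the lit(name), col(name) pairs for all eleven columns at once
val asMap = map(cols.flatMap(c => Seq(lit(c), col(c))): _*)

df.withColumn("udf_result", testFunc2Map(asMap))

One trade-off of the Map approach: all values in the map must share a single Spark type (here string), so heterogeneous arguments would need casting, or a struct column instead.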