Is there a way in Spring (Boot) to check whether a REST request contains parameters that are not explicitly declared by the REST method being called?
Using the required flag we can force the client to include a certain parameter in the request. I am looking for a similar way to forbid the client from sending parameters that are not explicitly mentioned in the controller method's declaration:
@RequestMapping("/hello")
public String hello(@RequestParam(value = "name") String name) {
//throw an exception if a REST client calls this method and
// sends a parameter with a name other than "name"
//otherwise run this method's logic
}
For example, calling
curl "localhost:8080/hello?name=world&city=London"
should result in a 4xx response.
One option is to check explicitly for unexpected parameters:
@RequestMapping("/hello")
public String hello(@RequestParam Map<String,String> allRequestParams) {
//throw an exception if allRequestParams contains a key that we cannot process here
//otherwise run this method's logic
}
But is it also possible to do this while keeping …
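A minimal sketch of the Map-based check described above, written as plain Java outside of Spring; `requireOnly` is a hypothetical helper name, and in a real controller the `IllegalArgumentException` could be replaced with an exception mapped to a 4xx status (e.g. Spring's `ResponseStatusException` with `HttpStatus.BAD_REQUEST`):

```java
import java.util.Map;
import java.util.Set;

public class ParamValidator {

    // Throws if the request contains any parameter name that is not
    // in the set of explicitly declared parameters.
    public static void requireOnly(Set<String> allowed, Map<String, String> params) {
        for (String key : params.keySet()) {
            if (!allowed.contains(key)) {
                throw new IllegalArgumentException("Unexpected parameter: " + key);
            }
        }
    }

    public static void main(String[] args) {
        // Mirrors the curl example: "name" is declared, "city" is not.
        try {
            requireOnly(Set.of("name"), Map.of("name", "world", "city", "London"));
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```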
When I write a DataFrame with a defined partitioning to disk as a Parquet file and then re-read the Parquet file, the partitioning is lost. Is there a way to preserve the DataFrame's original partitioning across the write and the re-read?
Example code:
//create a dataframe with 100 partitions and print the number of partitions
val originalDf = spark.sparkContext.parallelize(1 to 10000).toDF().repartition(100)
println("partitions before writing to disk: " + originalDf.rdd.partitions.length)
//write the dataframe to a parquet file and count the number of files actually written to disk
originalDf.write.mode(SaveMode.Overwrite).parquet("tmp/testds")
println("files written to disk: " + new File("tmp/testds").list.size)
//re-read the parquet file into a dataframe and print the number of partitions
val readDf = spark.read.parquet("tmp/testds")
println("partitions after reading from disk: " + readDf.rdd.partitions.length)
This prints …
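One workaround (an assumption on my part, not taken from the question) is to repartition explicitly after reading: the number of partitions after a Parquet read is driven by the file layout and settings such as spark.sql.files.maxPartitionBytes, not by the writer's partitioning, so an explicit repartition restores a known count. A sketch, assuming a SparkSession named `spark`:

```scala
import org.apache.spark.sql.SaveMode

import spark.implicits._

val originalDf = spark.sparkContext.parallelize(1 to 10000).toDF().repartition(100)
originalDf.write.mode(SaveMode.Overwrite).parquet("tmp/testds")

// Re-reading yields Spark-chosen partitions; restore a known count explicitly.
val readDf = spark.read.parquet("tmp/testds").repartition(100)
println(readDf.rdd.partitions.length) // 100 after the explicit repartition
```

Note that this only restores the partition count, not the exact original distribution of rows across partitions.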
What is the difference between explode and explode_outer? The documentation for the two functions is identical, and so are the examples:
SELECT explode(array(10, 20));
10
20
and
SELECT explode_outer(array(10, 20));
10
20
The Spark source suggests that there is a difference between the two functions:
expression[Explode]("explode"),
expressionGeneratorOuter[Explode]("explode_outer")
But what does expressionGeneratorOuter do compared to expression?
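The difference only becomes visible for NULL or empty arrays, which the identical documentation examples never exercise: explode drops such rows entirely, while explode_outer keeps them and emits NULL. A sketch in the same SQL style as the examples above:

```sql
SELECT explode(array());        -- empty array: produces no rows
SELECT explode_outer(array());  -- empty array: produces a single row with NULL
```

For non-empty arrays like array(10, 20), both functions behave identically, which is why the documented examples look the same.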