Posts by Ale*_*nko

Effect abstraction and parallel execution in Cats

I have the following code, written with Cats IO, which executes several actions in parallel (simplified):

import cats.effect._
import cats.implicits._

import scala.concurrent.ExecutionContext.Implicits.global

class ParallelExecIO {

  def exec: IO[List[String]] = {
    val foo = IO.shift *> IO("foo")
    val bar = IO.shift *> IO("bar")
    List(foo, bar).parSequence
  }
}

Is it possible to rewrite this code using an effect abstraction? What type class evidence needs to be provided?

Sample:

class ParallelExecIO[F[_]: ConcurrentEffect /* ??? */] {

    def exec: F[List[String]] = {
        val foo = Async.shift[F](implicitly) *> "foo".pure[F]
        val bar = Async.shift[F](implicitly) *> "bar".pure[F]
        List(foo, bar).parSequence 
    }
}

[error] value parSequence is not a member of List[F[String]]
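One possible direction for the rewrite (a sketch, not the accepted answer; assumes cats-effect 2.x, where `parSequence` additionally requires an implicit `Parallel[F]` instance and `Async.shift` takes an explicit `ExecutionContext`; the class name `ParallelExecF` and the `ec` parameter are hypothetical):

```scala
import cats.Parallel
import cats.effect.Async
import cats.implicits._

import scala.concurrent.ExecutionContext

// Sketch: the compile error above comes from the missing Parallel[F] evidence,
// which is what gives List[F[String]] its parSequence syntax.
class ParallelExecF[F[_]: Async: Parallel](ec: ExecutionContext) {

  def exec: F[List[String]] = {
    val foo = Async.shift[F](ec) *> "foo".pure[F]
    val bar = Async.shift[F](ec) *> "bar".pure[F]
    List(foo, bar).parSequence
  }
}
```

With `F = IO` the `Parallel[IO]` instance is derived from an implicit `ContextShift[IO]` in cats-effect 2.x, so the original `ParallelExecIO` keeps working unchanged.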

functional-programming scala scala-cats

6
votes
1
answer
887
views

Handling PATCH requests with Akka HTTP and circe for nullable fields

Is there a common approach to handle PATCH requests in REST API using circe library? By default, circe does not allow decoding partial JSON with only a part of the fields specified, i.e. it requires all fields to be set. You could use a withDefaults config, but it will be impossible to know if the field you received is null or just not specified. Here is a simplified sample of the possible solution. It uses Left[Unit] as a value to …
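The `Left[Unit]`-as-absent idea mentioned above can be sketched roughly as follows (a sketch under assumptions, not the accepted answer; assumes circe's cursor API, and `UserPatch` with its `email`/`age` fields is a hypothetical example):

```scala
import io.circe.{Decoder, HCursor}

// Either[Unit, Option[A]] distinguishes the three PATCH cases:
//   Left(())      - the field was absent from the JSON
//   Right(None)   - the field was explicitly null
//   Right(Some(a)) - the field carried a value
type PatchField[A] = Either[Unit, Option[A]]

def patchField[A: Decoder](c: HCursor, name: String): Decoder.Result[PatchField[A]] = {
  val field = c.downField(name)
  if (field.failed) Right(Left(()))        // key missing entirely
  else field.as[Option[A]].map(Right(_))   // null -> None, value -> Some
}

case class UserPatch(email: PatchField[String], age: PatchField[Int])

implicit val userPatchDecoder: Decoder[UserPatch] = Decoder.instance { c =>
  for {
    email <- patchField[String](c, "email")
    age   <- patchField[Int](c, "age")
  } yield UserPatch(email, age)
}
```

The PATCH handler can then fold each field: `Left(())` means "leave unchanged", `Right(None)` means "clear", and `Right(Some(v))` means "set to v".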

scala akka-http circe

5
votes
1
answer
420
views

Finding the facility with the longest accident-free interval using Apache Spark SQL

I have the following dataset:

|facility|date      |accidents|
| foo    |2019-01-01|1        |
| foo    |2019-01-02|null     |
| foo    |2019-01-03|null     |
| foo    |2019-01-04|2        |
| bar    |2019-01-01|1        |
| bar    |2019-01-02|null     |
| bar    |2019-01-03|3        |

The goal is to find the facility with the longest continuous accident-free interval:

|facility|startDate |interval|
|foo     |2019-01-02|2       |

Is it possible to do this with Spark SQL? Thanks.

P.S. Code sample:

import java.sql.Date

case class FacilityRecord(name: String, date: Date, accidents: Option[Int])
case class IntervalWithoutAccidents(name: String, startDate: Date, interval: Int)

implicit val spark: SparkSession = SparkSession.builder
  .appName("Test")
  .master("local")
  .getOrCreate()

import spark.implicits._

val facilityRecords = Seq(
  FacilityRecord("foo", Date.valueOf("2019-01-01"), Some(1)),
  FacilityRecord("foo", Date.valueOf("2019-01-02"), None),
  FacilityRecord("foo", …
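One possible approach (a sketch, not the accepted answer; it applies the gaps-and-islands pattern with window functions, and the intermediate column name `accidentsSoFar` is made up for illustration):

```scala
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Gaps-and-islands: a running count of accident days assigns every
// accident-free row to the "island" that follows the latest accident.
val byFacility = Window.partitionBy($"name").orderBy($"date")

val longestInterval = facilityRecords.toDS()
  .withColumn("accidentsSoFar",
    sum(when($"accidents".isNotNull, 1).otherwise(0)).over(byFacility))
  .filter($"accidents".isNull)
  .groupBy($"name", $"accidentsSoFar")
  .agg(min($"date").as("startDate"), count(lit(1)).as("interval"))
  .orderBy($"interval".desc)
  .limit(1)
  .select($"name".as("facility"), $"startDate", $"interval")
```

On the sample data this yields the expected `(foo, 2019-01-02, 2)` row: foo's two null days after the 2019-01-01 accident form one island of length 2, while bar's longest accident-free island has length 1.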

scala apache-spark apache-spark-sql

1
vote
1
answer
112
views