Posted by Che*_* Lv

How to fix "java.io.NotSerializableException: org.apache.kafka.clients.consumer.ConsumerRecord" in a Spark Streaming Kafka consumer?

  • Spark 2.0.0
  • Apache Kafka 0.10.1.0
  • Scala 2.11.8

When I use Spark Streaming with Kafka integration against Kafka broker version 0.10.1.0, with the Scala code below, it fails with the following exception:
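(The post's original code block did not survive here. Below is a minimal sketch that typically reproduces this exception with the spark-streaming-kafka-0-10 integration, assuming a direct stream subscribed to the topic local1 named in the stack trace; the bootstrap server and group id are placeholders.)

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object KafkaStreamRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-repro").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",   // placeholder broker address
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "test-group",                // placeholder group id
      "auto.offset.reset" -> "earliest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](Seq("local1"), kafkaParams))

    // print() collects raw ConsumerRecord objects to the driver, which requires
    // serializing them -- ConsumerRecord is not Serializable, hence the exception.
    stream.print()

    ssc.start()
    ssc.awaitTermination()
  }
}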

16/11/13 12:55:20 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.NotSerializableException: org.apache.kafka.clients.consumer.ConsumerRecord
Serialization stack:
    - object not serializable (class: org.apache.kafka.clients.consumer.ConsumerRecord, value: ConsumerRecord(topic = local1, partition = 0, offset = 10000, CreateTime = 1479012919187, checksum = 1713832959, serialized key size = -1, serialized value size = 1, key = null, value = a))
    - element of array (index: 0)
    - array (class [Lorg.apache.kafka.clients.consumer.ConsumerRecord;, size 11)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40) …
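(For reference: the size-11 array in the serialization stack matches the take(11) that DStream.print() performs, which ships raw ConsumerRecord objects to the driver and so must serialize them. A common fix, sketched here against the hypothetical stream above, is to project each record onto its serializable key and value before any operation that moves records off the executors.)

    // Replace stream.print() with a projection to serializable types:
    stream.map(record => (record.key, record.value)).print()

    // Or process records on the executors and only collect plain values:
    stream.foreachRDD { rdd =>
      rdd.map(_.value).take(10).foreach(println)
    }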

serialization apache-kafka apache-spark spark-streaming

12 votes · 1 answer · 8,524 views