What causes "unknown resolver null" in the Spark Kafka connector?

vzg*_*gfu 5 java apache-kafka apache-spark spark-streaming spark-submit

I am new to Spark. I have started Zookeeper and Kafka (0.10.1.1) locally, and also started a standalone Spark (2.2.0) cluster with one master and 2 workers. My local Scala version is 2.12.3.

I am able to run a word count job on Spark, and to publish/subscribe messages on a Kafka topic using the Kafka console producer and consumer.

My problem: whenever I add the Kafka package via spark-submit --packages, I get

...
:: problems summary ::
:::: ERRORS
    unknown resolver null

:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 13 already retrieved (0kB/9ms)
...

even when I do not use the Kafka connector at all. The verbose log follows.

Command

$SPARK_HOME/bin/spark-submit --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.2.0 --master spark://TUSMA06RMLVT047:7077 build/libs/sparkdriver-1.0-SNAPSHOT.jar

Log

Ivy Default Cache set to: /Users/v0001/.ivy2/cache
The jars for the packages stored in: /Users/v0001/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/2.2.0/libexec/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found org.apache.spark#spark-streaming-kafka-0-10_2.11;2.2.0 in local-m2-cache
    found org.apache.kafka#kafka_2.11;0.10.0.1 in local-m2-cache
    found com.101tec#zkclient;0.8 in local-m2-cache
    found org.slf4j#slf4j-api;1.7.16 in spark-list
    found org.slf4j#slf4j-log4j12;1.7.16 in spark-list
    found log4j#log4j;1.2.17 in spark-list
    found com.yammer.metrics#metrics-core;2.2.0 in local-m2-cache
    found org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 in spark-list
    found org.apache.kafka#kafka-clients;0.10.0.1 in local-m2-cache
    found net.jpountz.lz4#lz4;1.3.0 in spark-list
    found org.xerial.snappy#snappy-java;1.1.2.6 in spark-list
    found org.apache.spark#spark-tags_2.11;2.2.0 in local-m2-cache
    found org.spark-project.spark#unused;1.0.0 in spark-list
:: resolution report :: resolve 1805ms :: artifacts dl 14ms
    :: modules in use:
    com.101tec#zkclient;0.8 from local-m2-cache in [default]
    com.yammer.metrics#metrics-core;2.2.0 from local-m2-cache in [default]
    log4j#log4j;1.2.17 from spark-list in [default]
    net.jpountz.lz4#lz4;1.3.0 from spark-list in [default]
    org.apache.kafka#kafka-clients;0.10.0.1 from local-m2-cache in [default]
    org.apache.kafka#kafka_2.11;0.10.0.1 from local-m2-cache in [default]
    org.apache.spark#spark-streaming-kafka-0-10_2.11;2.2.0 from local-m2-cache in [default]
    org.apache.spark#spark-tags_2.11;2.2.0 from local-m2-cache in [default]
    org.scala-lang.modules#scala-parser-combinators_2.11;1.0.4 from spark-list in [default]
    org.slf4j#slf4j-api;1.7.16 from spark-list in [default]
    org.slf4j#slf4j-log4j12;1.7.16 from spark-list in [default]
    org.spark-project.spark#unused;1.0.0 from spark-list in [default]
    org.xerial.snappy#snappy-java;1.1.2.6 from spark-list in [default]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   13  |   2   |   2   |   0   ||   13  |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: ERRORS
    unknown resolver null


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
    confs: [default]
    0 artifacts copied, 13 already retrieved (0kB/9ms)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/11/08 15:53:55 INFO SparkContext: Running Spark version 2.2.0
17/11/08 15:53:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 15:53:55 INFO SparkContext: Submitted application: WordCount
17/11/08 15:53:55 INFO SecurityManager: Changing view acls to: v0001
17/11/08 15:53:55 INFO SecurityManager: Changing modify acls to: v0001
17/11/08 15:53:55 INFO SecurityManager: Changing view acls groups to: 
17/11/08 15:53:55 INFO SecurityManager: Changing modify acls groups to: 
17/11/08 15:53:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(v0001); groups with view permissions: Set(); users  with modify permissions: Set(v0001); groups with modify permissions: Set()
17/11/08 15:53:55 INFO Utils: Successfully started service 'sparkDriver' on port 63760.
17/11/08 15:53:55 INFO SparkEnv: Registering MapOutputTracker
17/11/08 15:53:55 INFO SparkEnv: Registering BlockManagerMaster
17/11/08 15:53:55 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/11/08 15:53:55 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/11/08 15:53:55 INFO DiskBlockManager: Created local directory at /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/blockmgr-b6a7af13-30eb-43ef-a235-e42105699289
17/11/08 15:53:55 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
17/11/08 15:53:55 INFO SparkEnv: Registering OutputCommitCoordinator
17/11/08 15:53:55 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/11/08 15:53:55 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.1.2:4040
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar at spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar with timestamp 1510174435998
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar at spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar at spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.spark-project.spark_unused-1.0.0.jar at spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/com.101tec_zkclient-0.8.jar at spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.slf4j_slf4j-log4j12-1.7.16.jar at spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar with timestamp 1510174435999
17/11/08 15:53:55 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/com.yammer.metrics_metrics-core-2.2.0.jar at spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar at spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar at spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.slf4j_slf4j-api-1.7.16.jar at spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/log4j_log4j-1.2.17.jar at spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/net.jpountz.lz4_lz4-1.3.0.jar at spark://10.0.1.2:63760/jars/net.jpountz.lz4_lz4-1.3.0.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/.ivy2/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar at spark://10.0.1.2:63760/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO SparkContext: Added JAR file:/Users/v0001/iot/thingspace/go/src/stash.verizon.com/npdthing/metrics/sparkdriver/build/libs/sparkdriver-1.0-SNAPSHOT.jar at spark://10.0.1.2:63760/jars/sparkdriver-1.0-SNAPSHOT.jar with timestamp 1510174436000
17/11/08 15:53:56 INFO Executor: Starting executor ID driver on host localhost
17/11/08 15:53:56 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 63761.
17/11/08 15:53:56 INFO NettyBlockTransferService: Server created on 10.0.1.2:63761
17/11/08 15:53:56 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/11/08 15:53:56 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.1.2:63761 with 366.3 MB RAM, BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.1.2, 63761, None)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 236.5 KB, free 366.1 MB)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 22.9 KB, free 366.0 MB)
17/11/08 15:53:56 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.1.2:63761 (size: 22.9 KB, free: 366.3 MB)
17/11/08 15:53:56 INFO SparkContext: Created broadcast 0 from textFile at WordCount.java:15
17/11/08 15:53:56 INFO FileInputFormat: Total input paths to process : 1
17/11/08 15:53:56 INFO FileOutputCommitter: File Output Committer Algorithm version is 1
17/11/08 15:53:56 INFO SparkContext: Starting job: saveAsTextFile at WordCount.java:21
17/11/08 15:53:56 INFO DAGScheduler: Registering RDD 2 (flatMapToPair at WordCount.java:18)
17/11/08 15:53:56 INFO DAGScheduler: Got job 0 (saveAsTextFile at WordCount.java:21) with 1 output partitions
17/11/08 15:53:56 INFO DAGScheduler: Final stage: ResultStage 1 (saveAsTextFile at WordCount.java:21)
17/11/08 15:53:56 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0)
17/11/08 15:53:56 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0)
17/11/08 15:53:56 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[2] at flatMapToPair at WordCount.java:18), which has no missing parents
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 5.3 KB, free 366.0 MB)
17/11/08 15:53:56 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 3.1 KB, free 366.0 MB)
17/11/08 15:53:56 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.0.1.2:63761 (size: 3.1 KB, free: 366.3 MB)
17/11/08 15:53:56 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
17/11/08 15:53:56 INFO DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[2] at flatMapToPair at WordCount.java:18) (first 15 tasks are for partitions Vector(0))
17/11/08 15:53:56 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/11/08 15:53:56 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 4937 bytes)
17/11/08 15:53:56 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/11/08 15:53:56 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO TransportClientFactory: Successfully created connection to /10.0.1.2:63760 after 30 ms (0 ms spent in bootstraps)
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-api-1.7.16.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4839646631087629609.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.slf4j_slf4j-api-1.7.16.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka_2.11-0.10.0.1.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp8667361266232337100.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.kafka_kafka_2.11-0.10.0.1.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.slf4j_slf4j-log4j12-1.7.16.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp5418243157152191799.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.slf4j_slf4j-log4j12-1.7.16.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2366789843424249528.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.scala-lang.modules_scala-parser-combinators_2.11-1.0.4.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-tags_2.11-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2527586655699915856.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.spark_spark-tags_2.11-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.spark-project.spark_unused-1.0.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4436635514367901872.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.spark-project.spark_unused-1.0.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/com.101tec_zkclient-0.8.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp4322710809557945921.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/com.101tec_zkclient-0.8.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar with timestamp 1510174435998
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp6210645736090344233.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.spark_spark-streaming-kafka-0-10_2.11-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/log4j_log4j-1.2.17.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2587760876873828850.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/log4j_log4j-1.2.17.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar with timestamp 1510174435999
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/com.yammer.metrics_metrics-core-2.2.0.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp8763096223513955185.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/com.yammer.metrics_metrics-core-2.2.0.jar to class loader
17/11/08 15:53:57 INFO Executor: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar with timestamp 1510174436000
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.apache.kafka_kafka-clients-0.10.0.1.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp2368772990989848791.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.apache.kafka_kafka-clients-0.10.0.1.jar to class loader
17/11/08 15:53:57 INFO Utils: Fetching spark://10.0.1.2:63760/jars/org.xerial.snappy_snappy-java-1.1.2.6.jar to /private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/fetchFileTemp5933403694236070460.tmp
17/11/08 15:53:57 INFO Executor: Adding file:/private/var/folders/26/lfnkdm_d6mj48xfwdyl_sntm8tdtp9/T/spark-07e33210-a6be-447c-92a0-4dd504e80965/userFiles-9a4b5741-a318-420e-b112-1a4441ba030a/org.xerial.snappy_snappy-java-1.1.2.6.jar to 

Mat*_*att 4

After running into the same problem, I deleted the Ivy cache under ~/.ivy2 and the Maven cache under ~/.m2. That fixed this issue, which I was hitting now and then with a variety of packages. Mostly I ran into it when switching from one Scala version to another.
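
The steps above can be sketched as the following shell commands. This is a minimal sketch, assuming the default cache locations under $HOME; the CACHE_ROOT variable is my own addition so you can rehearse on a scratch directory before touching your real caches:

```shell
# Remove the Ivy and Maven caches so that spark-submit --packages
# re-resolves every dependency from scratch on the next run.
# CACHE_ROOT defaults to $HOME; point it elsewhere to test safely.
CACHE_ROOT="${CACHE_ROOT:-$HOME}"

# Ivy cache and the jars spark-submit copies out of it
rm -rf "$CACHE_ROOT/.ivy2/cache" "$CACHE_ROOT/.ivy2/jars"

# Local Maven repository (resolved as "local-m2-cache" in the log above)
rm -rf "$CACHE_ROOT/.m2/repository"

echo "caches cleared under $CACHE_ROOT"
```

After this, re-running the same spark-submit command re-downloads all artifacts instead of reporting "already retrieved" from a possibly stale cache.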