I'm new to Spark. I started Zookeeper and Kafka (0.10.1.1) locally, and also started a standalone Spark (2.2.0) cluster with one master and 2 workers. My local Scala version is 2.12.3.
I can run WordCount on Spark, and I can publish/subscribe messages on a Kafka topic using the Kafka console producer and consumer.
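The console producer/consumer check mentioned above looks like this (a sketch, assuming Kafka's `bin/` directory is on `PATH`, the broker listens on `localhost:9092`, and `test` is a hypothetical topic name):

```shell
# Produce messages interactively to the hypothetical topic "test"
kafka-console-producer.sh --broker-list localhost:9092 --topic test

# In another terminal, consume everything from the beginning of that topic
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
```

If messages typed into the producer appear in the consumer, the local broker itself is working, which narrows the problem down to the `spark-submit --packages` resolution step.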
My problem is: whenever I add the Kafka package with spark-submit --packages, I get
...
:: problems summary ::
:::: ERRORS
unknown resolver null
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 13 already retrieved (0kB/9ms)
...
even if I don't use the Kafka connector at all. The verbose log is below:
Command
$SPARK_HOME/bin/spark-submit --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.2.0 --master spark://TUSMA06RMLVT047:7077 build/libs/sparkdriver-1.0-SNAPSHOT.jar
Logs
Ivy Default Cache set to: /Users/v0001/.ivy2/cache
The jars for the packages stored in: /Users/v0001/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/Cellar/apache-spark/2.2.0/libexec/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default] …