Unable to create SparkContext

Jin*_*Yoo 7 scala apache-kafka apache-spark

I am testing Spark with Scala code in spark-shell. I am building a prototype that uses Kafka and Spark.

I launched spark-shell as below.

spark-shell --jars ~/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar

Then I ran the following code in the shell.

import kafka.serializer.StringDecoder
import org.apache.spark.streaming._
import org.apache.spark.streaming.kafka._
import org.apache.spark.SparkConf


// Create context with 2 second batch interval
val sparkConf = new SparkConf().setAppName("DirectKafkaWordCount")
val ssc = new StreamingContext(sparkConf, Seconds(2) )

I got an error when creating ssc. spark-shell showed me the message below.

scala> val ssc = new StreamingContext(sparkConf, Seconds(2) )
15/06/05 09:06:08 INFO SparkContext: Running Spark version 1.3.1
15/06/05 09:06:08 INFO SecurityManager: Changing view acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: Changing modify acls to: vagrant
15/06/05 09:06:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(vagrant); users with modify permissions: Set(vagrant)
15/06/05 09:06:08 INFO Slf4jLogger: Slf4jLogger started
15/06/05 09:06:08 INFO Remoting: Starting remoting
15/06/05 09:06:08 INFO Utils: Successfully started service 'sparkDriver' on port 51270.
15/06/05 09:06:08 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@localhost:51270]
15/06/05 09:06:08 INFO SparkEnv: Registering MapOutputTracker
15/06/05 09:06:08 INFO SparkEnv: Registering BlockManagerMaster
15/06/05 09:06:08 INFO DiskBlockManager: Created local directory at /tmp/spark-d3349ba2-125b-4dda-83fa-abfa6c692143/blockmgr-c0e59bba-c4df-423f-b147-ac55d9bd5ccf
15/06/05 09:06:08 INFO MemoryStore: MemoryStore started with capacity 267.3 MB
15/06/05 09:06:08 INFO HttpFileServer: HTTP File server directory is /tmp/spark-842c15d5-7e3f-49c8-a4d0-95bdf5c6b049/httpd-26f5e751-8406-4a97-9ed3-aa79fc46bc6e
15/06/05 09:06:08 INFO HttpServer: Starting HTTP Server
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SocketConnector@0.0.0.0:55697
15/06/05 09:06:08 INFO Utils: Successfully started service 'HTTP file server' on port 55697.
15/06/05 09:06:08 INFO SparkEnv: Registering OutputCommitCoordinator
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
        at $line35.$read$$iwC$$iwC.<init>(<console>:52)
        at $line35.$read$$iwC.<init>(<console>:54)
        at $line35.$read.<init>(<console>:56)
        at $line35.$read$.<init>(<console>:60)
        at $line35.$read$.<clinit>(<console>)
        at $line35.$eval$.<init>(<console>:7)
        at $line35.$eval$.<clinit>(<console>)
        at $line35.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 WARN AbstractLifeCycle: FAILED org.spark-project.jetty.server.Server@e067ac3: java.net.BindException: Address already in use
java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:444)
        at sun.nio.ch.Net.bind(Net.java:436)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
        at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
        at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.spark-project.jetty.server.Server.doStart(Server.java:293)
        at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:199)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.ui.JettyUtils$$anonfun$2.apply(JettyUtils.scala:209)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1837)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1828)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:209)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:120)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:309)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:309)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $line35.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $line35.$read$$iwC$$iwC$$iwC.<init>(<console>:50)
        at $line35.$read$$iwC$$iwC.<init>(<console>:52)
        at $line35.$read$$iwC.<init>(<console>:54)
        at $line35.$read.<init>(<console>:56)
        at $line35.$read$.<init>(<console>:60)
        at $line35.$read$.<clinit>(<console>)
        at $line35.$eval$.<init>(<console>:7)
        at $line35.$eval$.<clinit>(<console>)
        at $line35.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
15/06/05 09:06:08 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
15/06/05 09:06:08 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
15/06/05 09:06:08 INFO Server: jetty-8.y.z-SNAPSHOT
15/06/05 09:06:08 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4041
15/06/05 09:06:08 INFO Utils: Successfully started service 'SparkUI' on port 4041.
15/06/05 09:06:08 INFO SparkUI: Started SparkUI at http://localhost:4041
15/06/05 09:06:08 INFO SparkContext: Added JAR file:/home/vagrant/spark/external/kafka-assembly/target/spark-streaming-kafka-assembly_2.10-1.3.1.jar at http://10.0.2.15:55697/jars/spark-streaming-kafka-assembly_2.10-1.3.1.jar with timestamp 1433495168735
15/06/05 09:06:08 INFO Executor: Starting executor ID <driver> on host localhost
15/06/05 09:06:08 INFO AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@localhost:51270/user/HeartbeatReceiver
15/06/05 09:06:08 INFO NettyBlockTransferService: Server created on 37393
15/06/05 09:06:08 INFO BlockManagerMaster: Trying to register BlockManager
15/06/05 09:06:08 INFO BlockManagerMasterActor: Registering block manager localhost:37393 with 267.3 MB RAM, BlockManagerId(<driver>, localhost, 37393)
15/06/05 09:06:08 INFO BlockManagerMaster: Registered BlockManager
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:80)
org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
$iwC$$iwC.<init>(<console>:9)
$iwC.<init>(<console>:18)
<init>(<console>:20)
.<init>(<console>:24)
.<clinit>(<console>)
.<init>(<console>:7)
.<clinit>(<console>)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:606)
org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1812)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1$$anonfun$apply$10.apply(SparkContext.scala:1808)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1808)
        at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:1795)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:1795)
        at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:1847)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:1754)
        at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:643)
        at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:75)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:40)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:42)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:44)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:46)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:48)
        at $iwC$$iwC$$iwC.<init>(<console>:50)
        at $iwC$$iwC.<init>(<console>:52)
        at $iwC.<init>(<console>:54)
        at <init>(<console>:56)
        at .<init>(<console>:60)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)
        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
        at org.apache.spark.repl.Main$.main(Main.scala:31)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I would like to know why creating the StreamingContext fails. Can you shed some light on this problem?

I also checked port 4040.

This is the list of listening ports before running spark-shell.

vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:47078           0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:111             0.0.0.0:*               LISTEN
tcp6       0      0 :::22                   :::*                    LISTEN
tcp6       0      0 :::44461                :::*                    LISTEN
tcp6       0      0 :::111                  :::*                    LISTEN
tcp6       0      0 :::80                   :::*                    LISTEN

This is the list of listening ports after running spark-shell.

vagrant@vagrant-ubuntu-trusty-64:~$ netstat -an | grep "LISTEN "
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:47078           0.0.0.0:*               LISTEN
tcp        0      0 0.0.0.0:111             0.0.0.0:*               LISTEN
tcp6       0      0 :::22                   :::*                    LISTEN
tcp6       0      0 :::55233                :::*                    LISTEN
tcp6       0      0 :::4040                 :::*                    LISTEN
tcp6       0      0 10.0.2.15:41545         :::*                    LISTEN
tcp6       0      0 :::44461                :::*                    LISTEN
tcp6       0      0 :::111                  :::*                    LISTEN
tcp6       0      0 :::56784                :::*                    LISTEN
tcp6       0      0 :::80                   :::*                    LISTEN
tcp6       0      0 :::39602                :::*                    LISTEN
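
For what it's worth, the tcp6 listener on :::4040 that appears after startup is the Web UI of the SparkContext that spark-shell itself creates. A minimal way to confirm this from inside the shell (just a sketch; none of these calls are needed to reproduce the problem):

// The default context `sc` already exists when spark-shell starts.
// Its Web UI falls back to port 4040 unless spark.ui.port was set explicitly.
println(sc.appName)                                     // prints "Spark shell"
println(sc.getConf.get("spark.ui.port", "4040 (default)"))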

blu*_*kin 11

When spark-shell starts, it already creates a default SparkContext named sc. The constructor you are calling tries to create another SparkContext instance, which is not what you should do. Instead, build the StreamingContext from the existing SparkContext using the overloaded constructor:

new StreamingContext(sparkContext: SparkContext, batchDuration: Duration) 

So your code should now look like this:

// Set the existing SparkContext's Master, AppName and other params
sc.getConf.setMaster("local[2]").setAppName("NetworkWordCount").set("spark.ui.port", "44040" )
// Use 'sc' to create a Streaming context with 2 second batch interval
val ssc = new StreamingContext(sc, Seconds(2) )
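
From there, the Kafka side of the prototype can be wired up with the imports already shown in the question. A minimal sketch using Spark 1.3's direct stream API; the broker address localhost:9092 and the topic name "test" are assumptions, adjust them to your setup:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

// Connect directly to the Kafka broker(s); no ZooKeeper-based receiver is used.
val kafkaParams = Map[String, String]("metadata.broker.list" -> "localhost:9092")
val topics = Set("test")

// Each record arrives as a (key, value) pair; keep only the message value.
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)
val lines = messages.map(_._2)
lines.print()

ssc.start()
ssc.awaitTermination()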