How to fix "Can't assign requested address: Service 'sparkDriver' failed after 16 retries" when running Spark code?

Bra*_*avo 10 scala apache-spark

I am learning Spark + Scala in IntelliJ, starting with the small piece of code below:

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    //Create a SparkContext to initialize Spark
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)

    println(numbersList)
  }

}

When I try to run it, I get the exception below:

Exception in thread "main" java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)

Process finished with exit code 1

Can anyone suggest what to do?

小智 15

It seems you are using an older version of Spark. In your case, try adding this line:

conf.set("spark.driver.bindAddress", "127.0.0.1")
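Placed in the question's program, the fix looks like this (a minimal sketch; everything except the bindAddress line is from the question):

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setMaster("local")
    conf.setAppName("Word Count")
    // bind the 'sparkDriver' service to the loopback address explicitly
    conf.set("spark.driver.bindAddress", "127.0.0.1")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)
    println(numbersList)
  }
}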

If you are using Spark 2.0+, the following should do the trick:

import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder()
  .appName("Word Count")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()
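With the session in place, the question's RDD code can run against its underlying context (a short usage sketch):

val sc = spark.sparkContext
val numbersList = sc.parallelize(1.to(10000).toList)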


小智 15

Add SPARK_LOCAL_IP to the load-spark-env.sh file located in the spark/bin directory:

export SPARK_LOCAL_IP="127.0.0.1"
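If you launch the application from IntelliJ rather than through the spark/bin scripts, that file is never sourced; an in-code counterpart is the driver bind/host configuration (a sketch, assuming the SparkConf setup from the question):

import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setMaster("local")
  .setAppName("Word Count")
  // in-code alternative to SPARK_LOCAL_IP for the driver process
  .set("spark.driver.bindAddress", "127.0.0.1")
  .set("spark.driver.host", "127.0.0.1")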


Gio*_*ous 12

The following should do the trick:

sudo hostname -s 127.0.0.1
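This helps because Spark resolves the machine's hostname when choosing a bind address; if that resolution fails or points at a stale address, the bind fails. You can check what the JVM resolves it to (a small diagnostic sketch; the object name is illustrative):

import java.net.InetAddress

object HostCheck {
  def main(args: Array[String]): Unit = {
    // if this throws UnknownHostException, Spark's bind will fail too
    val localhost = InetAddress.getLocalHost
    println(s"hostname = ${localhost.getHostName}, address = ${localhost.getHostAddress}")
  }
}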


小智 5

Sometimes the problem is related to a connected VPN or something similar. Disconnect your VPN, or any other tool that may affect your network configuration, and try again.
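A quick way to tell whether the network stack is the culprit is to attempt the same kind of bind the Spark driver does (a minimal standalone sketch, not Spark's own code):

import java.net.{InetAddress, ServerSocket}

object BindCheck {
  def main(args: Array[String]): Unit = {
    // try to bind an ephemeral port on the address the hostname resolves to,
    // which is roughly what the 'sparkDriver' service attempts
    val socket = new ServerSocket(0, 1, InetAddress.getLocalHost)
    println(s"bound successfully on port ${socket.getLocalPort}")
    socket.close()
  }
}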


Soh*_*ani 0

I think setMaster and setAppName return a new SparkConf object, so the line conf.setMaster("local") will have no effect on the conf variable. You should try:

val conf = new SparkConf()
    .setMaster("local[*]")
    .setAppName("Word Count")
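For completeness, the question's program with the chained configuration (a sketch based on the question's code; take(5) is used so the printout shows actual elements rather than the RDD's toString):

import org.apache.spark.{SparkConf, SparkContext}

object ActionsTransformations {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("Word Count")
    val sc = new SparkContext(conf)

    val numbersList = sc.parallelize(1.to(10000).toList)
    // print a few elements instead of the RDD reference
    println(numbersList.take(5).mkString(", "))
    sc.stop()
  }
}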