How do I connect two Docker containers running on the same host?

Gib*_*bbs 4 hadoop docker

I have two Docker containers running:

   docker ps

Result:

  CONTAINER ID        IMAGE               COMMAND                CREATED             STATUS              PORTS                    NAMES
  0bfd25abbfc6        f_service:latest    "/usr/local/start-fl   13 seconds ago      Up 2 seconds        0.0.0.0:8081->8081/tcp   flume
  6a1d974f4e3e        h_service:latest    "/usr/local/start-al   2 minutes ago       Up About a minute   0.0.0.0:8080->8080/tcp   hadoop

The Hadoop services run in the hadoop container [i.e. datanode, namenode, jobtracker, tasktracker, secondarynamenode].

The Flume service runs in the flume container [i.e. the flume agent].

I want to run hadoop commands from the flume container [e.g. hadoop fs -ls /]. How can I do that? Any ideas?

I tried linking the containers, but could not get it to work.

The run commands for the containers:

  docker run -it --name hadoop -p 8080:8080 h_service

  jps on hadoop container shows all hadoop services

  docker run -it -p 8081:8081 --name flume --link hadoop:hadoop f_service

  jps on the flume container shows only
  jps and Application [which is flume, I guess]

If I execute any hadoop command inside the flume container, I get the following error:

 mkdir: Call From 282fc55ec08d/172.17.5.236 to localhost:8020 failed on connection exception:
 java.net.ConnectException: Connection refused; For more details see:
 http://wiki.apache.org/hadoop/ConnectionRefused

telnet localhost 8020

Cannot connect to the remote host. Same for 8080.
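When telnet is not installed in a container, a bash-only probe can check the same thing (a sketch that assumes bash's /dev/tcp feature and the coreutils timeout command are available; the host and port are simply the ones from the error above):

```shell
#!/usr/bin/env bash
# Probe a TCP port without telnet; /dev/tcp/HOST/PORT is a bash
# redirection feature, not a real file on disk.
host=localhost
port=8020
if timeout 1 bash -c "cat < /dev/null > /dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
else
    echo "closed"
fi
```

From inside the flume container this prints "closed" for localhost:8020, which is consistent with the empty netstat output below: nothing is listening on the flume container's own loopback interface.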

netstat inside the flume container:

 netstat -na
 Active Internet connections (servers and established)
 Proto Recv-Q Send-Q Local Address           Foreign Address         State
 Active UNIX domain sockets (servers and established)
 Proto RefCnt Flags       Type       State         I-Node   Path

netstat on the hadoop container shows:

    netstat
    Active Internet connections (w/o servers)
    Proto Recv-Q Send-Q Local Address           Foreign Address         State
    tcp        0      0 localhost:49096         localhost:8020          TIME_WAIT
    tcp        0      0 localhost:49079         localhost:8020          ESTABLISHED
    tcp        0      0 localhost:8020          localhost:49079         ESTABLISHED
    tcp        0      0 c0c82bab5efd:54003      likho.canonical.com:80  TIME_WAIT
    tcp6       0      0 localhost:8021          localhost:40735         ESTABLISHED
    tcp6       0      0 localhost:40735         localhost:8021          ESTABLISHED
    Active UNIX domain sockets (w/o servers)
    Proto RefCnt Flags       Type       State         I-Node   Path
    unix  2      [ ]         STREAM     CONNECTED     9223040
    unix  2      [ ]         STREAM     CONNECTED     9223013
    unix  2      [ ]         STREAM     CONNECTED     9222782
    unix  2      [ ]         STREAM     CONNECTED     9222116
    unix  2      [ ]         STREAM     CONNECTED     9221761
    unix  2      [ ]         STREAM     CONNECTED     9221758
    unix  2      [ ]         STREAM     CONNECTED     9221302
    unix  2      [ ]         STREAM     CONNECTED     9221284
    unix  2      [ ]         STREAM     CONNECTED     9220884
    unix  2      [ ]         STREAM     CONNECTED     9220877

As for localhost:8020, I guess the 8020 comes from the core-site.xml configuration.
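That guess fits: a core-site.xml along these lines (the actual file is not shown in the question, so this fragment is an assumption) would make every hadoop client dial localhost:8020, which only works when the client runs in the same container as the namenode:

```xml
<!-- Hypothetical core-site.xml matching the error message above -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```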

all*_*eek 5

There is a simple solution for this. First, if you want to connect to port 8020 of your hadoop container, you should make sure that port is exposed as well. Second, each container has its own loopback interface (localhost) and its own IP address; the containers are connected to the host's eth0 interface through the docker0 bridge network. So you need to use the IP address that Docker injects into the flume container.

So these commands will start the containers correctly:

docker run -it --name hadoop --expose 8080 --expose 8020 h_service
docker run -it --name flume --link hadoop:had00p -p 8081:8081 f_service

But you need to tell flume to connect to hadoop at "had00p" instead of "localhost". I used had00p here only to distinguish the alias inside the container from the name you gave the container running hadoop.
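Concretely, that means the Hadoop client configuration inside the flume container would point at the link alias instead of localhost, along these lines (a sketch assuming the stock core-site.xml layout):

```xml
<!-- core-site.xml inside the flume container: target the link alias -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://had00p:8020</value>
  </property>
</configuration>
```

Note that the namenode in the hadoop container must also listen on an address reachable from outside its own container (e.g. bind to 0.0.0.0 rather than localhost), otherwise connections from the flume container will still be refused even with the link in place.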

Here is a simple example:

docker run -d --name container_a --expose 8080 busybox:latest nc -l 0.0.0.0 8080
docker run --rm --link container_a:dep_alias busybox:latest env
docker run --rm --link container_a:dep_alias busybox:latest cat /etc/hosts

When Docker creates an application link, it injects a number of environment variables and adds a hostname entry to the linked container's /etc/hosts file. If inter-container communication is disabled, it also adds firewall rules to allow traffic between the two containers.
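For illustration, the injected variables follow a fixed ALIAS_PORT_*-style naming scheme. This self-contained sketch just echoes values in the shape Docker would produce for --link container_a:dep_alias; the IP address is made up, not taken from a live container:

```shell
#!/usr/bin/env bash
# Hypothetical values in the shape Docker injects for --link container_a:dep_alias
DEP_ALIAS_PORT_8080_TCP_ADDR=172.17.0.2   # peer container's bridge IP (made up)
DEP_ALIAS_PORT_8080_TCP_PORT=8080
DEP_ALIAS_PORT_8080_TCP_PROTO=tcp

# An application in the linking container can build its peer address from these
# instead of hardcoding localhost:
echo "peer is at ${DEP_ALIAS_PORT_8080_TCP_ADDR}:${DEP_ALIAS_PORT_8080_TCP_PORT}"
```

Because the alias also lands in /etc/hosts, simply using the hostname "dep_alias" (or "had00p" in the answer above) works too, which is usually less brittle than the environment variables.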