I'm following this tutorial to install Hadoop: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/ and I'm now stuck at the "Copy local example data to HDFS" step.
These are the connection errors I get:

12/10/26 17:29:16 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 0 time(s).
12/10/26 17:29:17 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 1 time(s).
12/10/26 17:29:18 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 2 time(s).
12/10/26 17:29:19 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 3 time(s).
12/10/26 17:29:20 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. Already tried 4 time(s).
12/10/26 17:29:21 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:54310. …

I want to access a JSON page containing data. Say it is at http://example.com/home/apples.json
It is protected, so a login page is shown instead. If I were to access it manually, I would first go to http://example.com/ where I get a login form. After logging in, say with user "test" and password "test", visiting http://example.com/home/apples.json again shows me the JSON data.
If I run
curl -u test:test http://example.com/home/apples.json -v
I end up at the login page. (Even though I authenticated.)
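One way to see why -u does not help here: -u only attaches an HTTP Basic `Authorization` header (the base64 of user:pass) to the request; it never submits the site's login form, so a form-and-cookie based login simply ignores it. The exact header value curl would send can be reproduced locally:

```shell
# curl -u test:test sends "Authorization: Basic <base64 of test:test>".
# It does not fill in any HTML login form on the page.
printf 'test:test' | base64
# → dGVzdDp0ZXN0
```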
If I go to http://example.com/ and log in manually, I get a cookie after logging in. Say this cookie is called ltoken and gets the value "dj8f". It is only set after a successful login. If I look up the cookie data in my browser, copy it, and append it to the curl command:
curl -b "ltoken=dj8f" http://example.com/home/apples.json -v
it works.
How can I get the cookie data from the login without doing it manually? Can this be scripted in a bash shell script? If so, how? Alternatively, how could it be done in a Groovy script?
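The flow described above (log in once, capture the cookie, replay it) can be sketched in shell with curl's cookie jar (-c writes cookies, -b sends them back). The login URL http://example.com/login and the form field names username/password are assumptions; inspect the real login form in the page source to find the correct URL and input names:

```shell
#!/bin/sh
# Hypothetical login endpoint and form field names -- check the actual
# login page's <form action> and <input name> attributes.
LOGIN_URL="http://example.com/login"
JSON_URL="http://example.com/home/apples.json"
JAR="cookies.txt"

# Step 1: log in once; -c writes every cookie the server sets
# (including ltoken) into the jar file.
login() {
    curl -s -c "$JAR" \
         -d "username=test" \
         -d "password=test" \
         "$LOGIN_URL" > /dev/null
}

# Step 2: replay the stored cookies when fetching the protected JSON.
fetch_json() {
    curl -s -b "$JAR" "$JSON_URL"
}
```

With the real URL and field names filled in, `login && fetch_json` should print the JSON without any manual browser step; no cookie value ever needs to be copied by hand, since the jar file carries it between the two requests.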