Download URLs listed in a file using curl?

Dev*_*Dev 20 curl

I have a file that contains all the URLs I need to download. However, I need to limit it to one download at a time, i.e. the next download should start only once the previous one has finished. Is this possible with curl? Or should I use something else?

Gru*_*rig 28

xargs -n 1 curl -O < your_files.txt
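
Here `xargs -n 1` runs a separate `curl -O` for each line of the file, one after another, which gives exactly the sequential behaviour the question asks for. A hypothetical your_files.txt would simply list one URL per line, for example:

    # your_files.txt -- one URL per line (illustrative contents)
    # https://example.com/a.iso
    # https://example.com/b.iso

    # -n 1: hand curl one URL at a time; invocations run sequentially
    # -O:   save each download under its remote file name
    xargs -n 1 curl -O < your_files.txt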

  • This is the best answer. Although the asker didn't specify, it's probably safe to assume each URL's response should be written to its own file; the `-O` option to cURL does exactly that: `xargs -n 1 curl -O < your_file.txt` (3 upvotes)

daw*_*wud 26

wget(1) works sequentially by default and has this option built in:

   -i file
   --input-file=file
       Read URLs from a local or external file.  If - is specified as file, URLs are read from the standard input.  (Use ./- to read from a file literally named -.)

       If this function is used, no URLs need be present on the command line.  If there are URLs both on the command line and in an input file, those on the command lines will be the first ones to be retrieved.  If
       --force-html is not specified, then file should consist of a series of URLs, one per line.

       However, if you specify --force-html, the document will be regarded as html.  In that case you may have problems with relative links, which you can solve either by adding "<base href="url">" to the documents
       or by specifying --base=url on the command line.

       If the file is an external one, the document will be automatically treated as html if the Content-Type matches text/html.  Furthermore, the file's location will be implicitly used as base href if none was
       specified.
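
With the URLs collected one per line in a file, the whole job then reduces to a single command (the file name here is only illustrative):

    # fetches every URL from the list, in order, one at a time
    wget -i your_files.txt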

  • Since the asker wanted to know how to do this with cURL, you should at least include a solution that attempts to use it. (5 upvotes)

use*_*517 4

This can be done with curl in a shell script, something like the following, but you will need to work out the appropriate curl options etc. yourself:

while read -r URL; do
    curl -O "$URL"                        # add whatever curl options you need
    if [ $? -ne 0 ]; then                 # check the exit status if required
        echo "download failed: $URL" >&2  # take appropriate action
    fi
done < file_containing_urls
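
As one possible set of "appropriate options" (an illustrative sketch, not necessarily what the answerer had in mind): `--fail` makes curl exit non-zero on HTTP errors such as 404, and `--retry` retries transient failures, so the exit-status handling above could be tightened to:

    # --fail: treat HTTP errors as failed downloads; --retry 3: retry transient errors
    while read -r URL; do
        curl --fail --retry 3 -O "$URL" || echo "giving up on $URL" >&2
    done < file_containing_urls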

  • I know this is semi-pseudocode, but I think the while loop should still have a "do". (2 upvotes)