Copying an Hg repo with all largefiles

Chr*_*nes 7 mercurial

We have a large, old repository that uses largefiles. I want to copy the repository to a backup server with a cron script that runs hg pull. However, this command does not retrieve the largefiles.

I currently have 2 GB of history copied, but I am missing 6 GB of largefiles. How do I get Hg to download these important files?

Mat*_*sdm 11

By default, only the largefiles for the revision you update to are downloaded.

'hg help largefiles' says:

When you pull a changeset that affects largefiles from a remote repository,
the largefiles for the changeset will by default not be pulled down. However,
when you update to such a revision, any largefiles needed by that revision are
downloaded and cached (if they have never been downloaded before). One way to
pull largefiles when pulling is thus to use --update, which will update your
working copy to the latest pulled revision (and thereby downloading any new
largefiles).

If you want to pull largefiles you don't need for update yet, then you can use
pull with the "--lfrev" option or the "hg lfpull" command.
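
In other words, largefiles can be fetched at pull time. A minimal sketch of the two approaches mentioned in the help text above (passing "all()" to --lfrev is an assumption; adjust the revset to the revisions you care about):

# Pull history and update the working copy, which downloads the largefiles
# needed by the updated-to revision.
hg pull --update

# Or ask pull itself to download largefiles for specific revisions.
hg pull --lfrev "all()"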

You should be able to use 'hg lfpull --rev "all()"' for this purpose.
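
For a cron-driven backup, a minimal script sketch, assuming the largefiles extension is enabled on the backup clone and that /backup/repo is a hypothetical path to that clone:

#!/bin/sh
# Hypothetical location of the backup clone; adjust to your setup.
cd /backup/repo || exit 1
# Pull new changesets (history only; largefile contents are not fetched here).
hg pull
# Download the largefile contents for every revision, not just the working copy's.
hg lfpull --rev "all()"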