Problem Description

I want something in git that is similar to Mercurial's Bigfiles Extension (note: I know of git-bigfiles, but that is unrelated).

Basically I want to store large binaries in my git repository, but I don't want to get every version ever of the large binary when I do a clone. I only want to download the large binaries when I checkout a specific revision containing those large files.

Solution

Here are a few options to consider:

Shallow clones: You can pass the --depth <depth> parameter to git clone to get a shallow clone of the repository. For example, if <depth> is 1, the clone will only fetch the files needed for the most recent commit. However, such repositories have awkward restrictions on what you can do with them, as outlined in the git clone man page:

        --depth
           Create a shallow clone with a history truncated to the specified
           number of revisions. A shallow repository has a number of
           limitations (you cannot clone or fetch from it, nor push from or
           into it), but is adequate if you are only interested in the recent
           history of a large project with a long history, and would want to
           send in fixes as patches.

In fact, as discussed in this thread, that's something of an overstatement: there are useful situations where pushing from a shallow clone will still work, and that may fit your workflow.
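
As a minimal sketch of this option (the repository URL here is just a placeholder):

        # Fetch only the objects needed for the single most recent commit
        git clone --depth 1 https://example.com/big-repo.git

        # If you later decide you need more history, the clone can be deepened
        git fetch --depth=100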

Scott Chacon's "git media" extension: the author describes this in an answer to this similar question and in the README on GitHub: http://github.com/schacon/git-media.

Shallow submodules: you could keep all your large files in a separate git repository and add that as a shallow submodule to your main repository. This would have the advantage that you don't have the restrictions of shallow clones for your code, just the repository with the large files.
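
A rough sketch of that layout might look like the following; the URLs and the assets/ path are placeholders, and the --shallow-submodules option assumes a reasonably recent Git:

        # In the main repository, track the big-file repository as a submodule
        git submodule add https://example.com/big-assets.git assets
        git commit -m "Track large assets as a submodule"

        # Consumers can clone the code in full while keeping the submodule shallow
        git clone --recurse-submodules --shallow-submodules https://example.com/project.git

        # Or, after an ordinary clone, fetch only the submodule commit they need
        git submodule update --init --depth 1 assets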

There are also any number of ways of doing this with git hooks that (for example) rsync your large files into place, but I assume that there are good reasons why you want to keep these files under git's control in the first place.
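
If you did go the hook route anyway, one purely illustrative shape for it is a post-checkout hook that pulls the matching binaries from a file server (the server name and directory layout below are made up):

        #!/bin/sh
        # .git/hooks/post-checkout -- sketch only; the server and layout are hypothetical
        # Pull the large binaries published for the commit that was just checked out.
        rev=$(git rev-parse HEAD)
        rsync -a --delete "fileserver:/srv/big-assets/$rev/" assets/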

I hope that's of some help.
