git.vger.kernel.org archive mirror
* Roadmap to better handle big files?
@ 2010-02-24 23:00 Nick Triantos
  2010-02-24 23:39 ` Nicolas Pitre
                   ` (2 more replies)
  0 siblings, 3 replies; 5+ messages in thread
From: Nick Triantos @ 2010-02-24 23:00 UTC (permalink / raw)
  To: git@vger.kernel.org

Hi,

Is there any planned functionality to better support large files in git?  (> 100MB / file)

We've been happily using git, but we now have some files which we'd very much like to keep under the same version control as our source code, and some of those files have been as large as 450MB each.  We are looking at chunking each file up before committing it to git, but is there any plan to better support chunking of these files during repacks or other operations?  Right now, from reading various places on the web, it appears that either the whole file, or the whole collection of files in a commit (I'm not sure which), may need to be resident in memory up to twice.  Our poor 32-bit server is barfing on this.  We are going to put more RAM and a 64-bit OS on the machine, but this still seems like an unnecessary design decision.
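[The chunk-before-commit workaround mentioned above can be sketched with standard tools; the file names and the chunk size here are illustrative, not part of any git feature:]

```shell
set -e

# Stand-in for a large binary asset (a real one would be ~450MB;
# 64KB keeps this sketch fast).
dd if=/dev/zero of=bigasset.bin bs=1024 count=64 2>/dev/null

# Split into fixed-size pieces: bigasset.bin.part_aa, _ab, ...
# Each piece stays under whatever limit the repack machinery copes with.
split -b 16k bigasset.bin bigasset.bin.part_

# The pieces, not the original, would then be committed:
#   git add bigasset.bin.part_* && git commit -m "chunked asset"

# Reassembly (e.g. from a script or checkout hook) is just concatenation
# in lexical order, which split's suffixes already guarantee.
cat bigasset.bin.part_* > reassembled.bin
cmp bigasset.bin reassembled.bin && echo "chunks reassemble cleanly"
```

[This only caps the size of any single blob git has to hold in memory; deltification across chunk boundaries is lost, so it is a workaround rather than a substitute for real large-file support.]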

thanks very much,
-Nick



Thread overview: 5+ messages
2010-02-24 23:00 Roadmap to better handle big files? Nick Triantos
2010-02-24 23:39 ` Nicolas Pitre
2010-02-24 23:51 ` Jakub Narebski
2010-02-25  0:02   ` Nick Triantos
2010-02-25 18:06 ` Joshua Jensen
