* out of memory error with git push and pull
@ 2011-06-01 22:33 Qingning Huo
  2011-06-02  4:46 ` Joey Hess
  0 siblings, 1 reply; 3+ messages in thread
From: Qingning Huo @ 2011-06-01 22:33 UTC (permalink / raw)
  To: git

Hi All,

I tried to use git to manage my digital photos but encountered some
problems. The typical file sizes are hundreds of KB or a few MB; in
total, about 15GB of data in about 10,000 files. My intention is to
get them into a git repository and clone it onto a few computers.
Probably I will make some occasional changes like editing and
deleting, but I think most of the files would stay at version one.

The setup is a centralized repository that I pull from and push
into. There is a gitweb interface for the centralized repo.

I started by creating a small repository on the server (ubuntu) and
kept pushing data into it from a windows machine (using cygwin).
Halfway through the process (after pushing about 8GB of data), I
found that I could not run git push any more. This is the error
message I got:

$ git push
Counting objects: 621, done.
Delta compression using up to 4 threads.
fatal: Out of memory? mmap failed: Cannot allocate memory
error: pack-objects died with strange error
error: failed to push some refs to 'ssh://huo@ubuntu/mnt/share/git/photo.git'

At the same time, I found that I could not pull from this repository either.

$ git pull
remote: Counting objects: 8088, done.
error: pack-objects died of signal 983/8057)
error: git upload-pack: git-pack-objects died with error.
fatal: git upload-pack: aborting due to possible repository corruption
on the remote side.
remote: aborting due to possible repository corruption on the remote side.
fatal: protocol error: bad pack header

[The second line probably said "died of signal 9"; the remote's
progress counter was at ???/8057 and its output got mixed into the
error message.]

I wonder whether anyone has tried using git in a similar scenario. Is
git capable of handling this kind of data? And are there any settings
and/or command-line options that I should use? I had a quick look at
git help push (and pull/fetch) but could not see anything obvious.

BTW, I am using git version 1.7.0.4 on the ubuntu server, and version
1.7.2.3 for cygwin on the client side.

Thanks in advance.

Qingning

* Re: out of memory error with git push and pull
  2011-06-01 22:33 out of memory error with git push and pull Qingning Huo
@ 2011-06-02  4:46 ` Joey Hess
  2011-06-02 22:13   ` Qingning Huo
  0 siblings, 1 reply; 3+ messages in thread
From: Joey Hess @ 2011-06-02  4:46 UTC (permalink / raw)
  To: Qingning Huo; +Cc: git

Qingning Huo wrote:
> I tried to use git to manage my digital photos but encountered some
> problems. The typical file sizes are hundreds of KB or a few MB; in
> total, about 15GB of data in about 10,000 files. My intention is to
> get them into a git repository and clone it onto a few computers.
> Probably I will make some occasional changes like editing and
> deleting, but I think most of the files would stay at version one.

I try not to mention git-annex too much here, but this is a perfect
use-case for it. http://git-annex.branchable.com/ 

Well, it would be more perfect if you had enough data in your repo that
you didn't necessarily want to clone it all to every computer. Like so:

# git annex status
local annex size: 58 megabytes
total annex keys: 38158
total annex size: 6 terabytes

:)
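
The basic workflow is roughly this (remote and file names here are
just examples) -- git-annex checksums each file and checks in a small
symlink instead, so git itself never has to delta-compress or pack
the photo data:

$ git annex init "laptop"
$ git annex add photos/                 # content goes under .git/annex, symlinks get staged
$ git commit -m "add photos"
$ git annex copy --to origin            # send the actual file contents to a remote
$ git annex get photos/img_0001.jpg     # fetch one file's content back in another clone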

> I wonder whether anyone has tried using git in a similar scenario. Is
> git capable of handling this kind of data? And are there any settings
> and/or command-line options that I should use? I had a quick look at
> git help push (and pull/fetch) but could not see anything obvious.

There is a tunable you can use to improve things; see core.bigFileThreshold.

That setting originally came from this project:
http://caca.zoy.org/wiki/git-bigfiles -- it may have some other
improvements that have not landed in git, I'm not sure.
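
Something along these lines on the server might get pack-objects back
under control; the numbers below are only guesses, and
core.bigFileThreshold may need a newer git than your 1.7.0.4:

$ git config core.bigFileThreshold 1m   # don't try to delta-compress files over 1 MB
$ git config pack.windowMemory 256m     # cap memory used by each delta-search thread
$ git config pack.threads 1             # fewer threads, lower peak memory use
$ git repack -a -d                      # rebuild the packs with the new limits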

-- 
see shy jo

* Re: out of memory error with git push and pull
  2011-06-02  4:46 ` Joey Hess
@ 2011-06-02 22:13   ` Qingning Huo
  0 siblings, 0 replies; 3+ messages in thread
From: Qingning Huo @ 2011-06-02 22:13 UTC (permalink / raw)
  To: Joey Hess; +Cc: git

Hi Joey,

On Thu, Jun 2, 2011 at 5:46 AM, Joey Hess <joey@kitenet.net> wrote:
> Qingning Huo wrote:
>> I tried to use git to manage my digital photos but encountered some
>> problems. The typical file sizes are hundreds of KB or a few MB; in
>> total, about 15GB of data in about 10,000 files. My intention is to
>> get them into a git repository and clone it onto a few computers.
>> Probably I will make some occasional changes like editing and
>> deleting, but I think most of the files would stay at version one.
>
> I try not to mention git-annex too much here, but this is a perfect
> use-case for it. http://git-annex.branchable.com/
>
> Well, it would be more perfect if you had enough data in your repo that
> you didn't necessarily want to clone it all to every computer. Like so:
>
> # git annex status
> local annex size: 58 megabytes
> total annex keys: 38158
> total annex size: 6 terabytes
>
> :)

Thanks a lot for the pointer. I'd love to use git-annex if I can get
my hands on it. I had a look at the web site and searched a bit on
the web, but there does not seem to be an easy way to install it on
windows/cygwin.

I might try the bigFileThreshold setting first, and maybe git-bigfiles.

>
>> I wonder whether anyone has tried using git in a similar scenario. Is
>> git capable of handling this kind of data? And are there any settings
>> and/or command-line options that I should use? I had a quick look at
>> git help push (and pull/fetch) but could not see anything obvious.
>
> There is a tunable you can use to improve things; see core.bigFileThreshold.
>
> That setting originally came from this project:
> http://caca.zoy.org/wiki/git-bigfiles -- it may have some other
> improvements that have not landed in git, I'm not sure.
>
> --
> see shy jo
>

Thanks
Qingning
