* [FYI] very large text files and their problems.
From: Ian Kumlien @ 2012-02-22 15:49 UTC (permalink / raw)
To: git
Hi,
We just saw an interesting issue: git compressed a ~3.4 GB project to ~57
MB. But when we tried to clone it on a big machine we got:
fatal: Out of memory, malloc failed (tried to allocate
18446744072724798634 bytes)
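(For what it's worth, that impossibly large figure is 2^64 - 984752982,
i.e. a negative size of roughly -984 million bytes reinterpreted as an
unsigned 64-bit number - consistent with an integer overflow rather than a
real allocation request.)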
This is already fixed in the 1.7.10 mainline - but it also seems like
git needs to have at least as much free memory as the largest
file... Couldn't this be worked around?
On a (32-bit) machine with 4 GB of memory this results in:
fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
(And I see how this could be a problem, but couldn't it be mitigated? Or
is it by design and intended behaviour?)
I'm not subscribed, so please keep me in CC.
/Ian Kumlien
* Re: [FYI] very large text files and their problems.
From: Nguyen Thai Ngoc Duy @ 2012-02-22 16:18 UTC (permalink / raw)
To: Ian Kumlien; +Cc: git
On Wed, Feb 22, 2012 at 10:49 PM, Ian Kumlien <pomac@vapor.com> wrote:
> Hi,
>
> We just saw an interesting issue: git compressed a ~3.4 GB project to ~57 MB.
How big are those files? How many of them? How often do they change?
> But when we tried to clone it on a big machine we got:
>
> fatal: Out of memory, malloc failed (tried to allocate
> 18446744072724798634 bytes)
>
> This is already fixed in the 1.7.10 mainline - but it also seems like
Does 1.7.9 have this problem?
> git needs to have at least as much free memory as the largest
> file... Couldn't this be worked around?
>
> On a (32-bit) machine with 4 GB of memory this results in:
> fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
>
> (And I see how this could be a problem, but couldn't it be mitigated? Or
> is it by design and intended behaviour?)
I think that it's delta resolving that hogs all your memory. If your
files are smaller than 512M, try lowering core.bigFileThreshold. The
topic jc/split-blob, which stores a big file as several smaller
pieces, might solve your problem. Unfortunately the topic is not
complete yet.
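(For reference, lowering the threshold is a one-line config change; a
minimal sketch, assuming the 512M default mentioned above and using 256m
purely as an example value:

  git config core.bigFileThreshold 256m

Blobs above the threshold are then stored deflated without git attempting
delta compression on them.)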
--
Duy
* Re: [FYI] very large text files and their problems.
From: Ian Kumlien @ 2012-02-24 10:11 UTC (permalink / raw)
To: Nguyen Thai Ngoc Duy; +Cc: git
I'm uncertain if you got my reply since I sent it out of band - so I'll
repeat myself - sorry... =)
On Wed, Feb 22, 2012 at 11:18:19PM +0700, Nguyen Thai Ngoc Duy wrote:
> On Wed, Feb 22, 2012 at 10:49 PM, Ian Kumlien <pomac@vapor.com> wrote:
> > Hi,
> >
> > We just saw an interesting issue: git compressed a ~3.4 GB project to ~57 MB.
>
> How big are those files? How many of them? How often do they change?
This was the first check-in; there are no deltas yet.
The file in question is ~3.3 GB in size - i.e. exactly 3310214313 bytes
(as seen below in the malloc failure).
git show <blob sha1 id> | wc -c gives the exact same result.
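(For what it's worth, the blob size can also be read without streaming the
whole content through a pipe; a sketch, reusing the same placeholder id:

  git cat-file -s <blob sha1 id>

For an undeltified blob this only has to look at the object header, so it
should be considerably cheaper than piping everything through wc.)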
> > But when we tried to clone it on a big machine we got:
> >
> > fatal: Out of memory, malloc failed (tried to allocate
> > 18446744072724798634 bytes)
> >
> > This is already fixed in the 1.7.10 mainline - but it also seems like
>
> Does 1.7.9 have this problem?
I've only tested 1.7.8 and 1.7.9.1 - it works in mainline git (pre-1.7.10).
> > git needs to have at least as much free memory as the largest
> > file... Couldn't this be worked around?
> >
> > On a (32-bit) machine with 4 GB of memory this results in:
> > fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
> >
> > (And I see how this could be a problem, but couldn't it be mitigated? Or
> > is it by design and intended behaviour?)
>
> I think that it's delta resolving that hogs all your memory. If your
> files are smaller than 512M, try lowering core.bigFileThreshold. The
> topic jc/split-blob, which stores a big file as several smaller
> pieces, might solve your problem. Unfortunately the topic is not
> complete yet.
Well, in this case it's just stream-unpacking gzip data to disk; I could
understand if delta were a problem... But wouldn't delta be a problem
in the sense of <size_of_change> + <size_of_subdata> + <result>?
I.e., if the file is mmapped, it shouldn't have to be allocated, right?
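(As a rough worked example of that cost for this blob, assuming neither the
base nor the result is mmapped: ~3.3 GB for the base plus ~3.3 GB for the
reconstructed result is already ~6.6 GB of peak memory, before the delta
data itself is even counted.)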
> --
> Duy
* Re: [FYI] very large text files and their problems.
From: Nguyen Thai Ngoc Duy @ 2012-02-24 11:14 UTC (permalink / raw)
To: Ian Kumlien; +Cc: git
On Fri, Feb 24, 2012 at 5:11 PM, Ian Kumlien <pomac@vapor.com> wrote:
> I'm uncertain if you got my reply since I sent it out of band - so I'll
> repeat myself - sorry... =)
Yes, I received it, just too busy this week.
>> > git needs to have at least as much free memory as the largest
>> > file... Couldn't this be worked around?
>> >
>> > On a (32-bit) machine with 4 GB of memory this results in:
>> > fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
>> >
>> > (And I see how this could be a problem, but couldn't it be mitigated? Or
>> > is it by design and intended behaviour?)
>>
>> I think that it's delta resolving that hogs all your memory. If your
>> files are smaller than 512M, try lowering core.bigFileThreshold. The
>> topic jc/split-blob, which stores a big file as several smaller
>> pieces, might solve your problem. Unfortunately the topic is not
>> complete yet.
>
> Well, in this case it's just stream-unpacking gzip data to disk; I could
> understand if delta were a problem... But wouldn't delta be a problem
> in the sense of <size_of_change> + <size_of_subdata> + <result>?
>
> I.e., if the file is mmapped, it shouldn't have to be allocated, right?
We should not delta large files. I was worried that the large-file
check could go wrong, but I guess your blob's not deltified in this
case.
When you receive a pack during a clone, the pack is streamed to
index-pack, not mmapped, and index-pack checks every object in there
in uncompressed form. I think I have found a way to avoid allocating
that much. Need some more checking, then I'll send it out.
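(For reference, the memory behaviour of just this phase can be reproduced
outside of a clone by pointing index-pack at a pack file directly; a
sketch, with a hypothetical pack path:

  git index-pack -v /path/to/big-repo.pack

which indexes the pack and checks every object in it, much like the
receiving end of a clone does.)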
--
Duy
* Re: [FYI] very large text files and their problems.
From: Ian Kumlien @ 2012-02-24 12:55 UTC (permalink / raw)
To: Nguyen Thai Ngoc Duy; +Cc: git
On Fri, Feb 24, 2012 at 06:14:46PM +0700, Nguyen Thai Ngoc Duy wrote:
> On Fri, Feb 24, 2012 at 5:11 PM, Ian Kumlien <pomac@vapor.com> wrote:
> > I'm uncertain if you got my reply since I sent it out of band - so I'll
> > repeat myself - sorry... =)
>
> Yes, I received it, just too busy this week.
Ah, good - you never know what anti-spam measures people apply these
days... =)
And I'm in the same boat, so I totally understand.
> >> > git needs to have at least as much free memory as the largest
> >> > file... Couldn't this be worked around?
> >> >
> >> > On a (32-bit) machine with 4 GB of memory this results in:
> >> > fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
> >> >
> >> > (And I see how this could be a problem, but couldn't it be mitigated? Or
> >> > is it by design and intended behaviour?)
> >>
> >> I think that it's delta resolving that hogs all your memory. If your
> >> files are smaller than 512M, try lowering core.bigFileThreshold. The
> >> topic jc/split-blob, which stores a big file as several smaller
> >> pieces, might solve your problem. Unfortunately the topic is not
> >> complete yet.
> >
> > Well, in this case it's just stream-unpacking gzip data to disk; I could
> > understand if delta were a problem... But wouldn't delta be a problem
> > in the sense of <size_of_change> + <size_of_subdata> + <result>?
> >
> > I.e., if the file is mmapped, it shouldn't have to be allocated, right?
>
> We should not delta large files. I was worried that the large-file
> check could go wrong, but I guess your blob's not deltified in this
> case.
That would be correct.
> When you receive a pack during a clone, the pack is streamed to
> index-pack, not mmapped, and index-pack checks every object in there
> in uncompressed form. I think I have found a way to avoid allocating
> that much. Need some more checking, then I'll send it out.
Ah! That explains a lot - do you have a publicly available version I
could look at?
> --
> Duy
* Re: [FYI] very large text files and their problems.
From: Ian Kumlien @ 2012-02-22 18:39 UTC (permalink / raw)
To: git; +Cc: pclouds
Seems like I ruined my dovecot config in a recent upgrade - which also
affected my mail... =/
Anyway, it's all fixed now.
from: Nguyen Thai Ngoc Duy <pclouds () gmail ! com>
> On Wed, Feb 22, 2012 at 10:49 PM, Ian Kumlien <pomac@vapor.com> wrote:
> > Hi,
> >
> > We just saw an interesting issue: git compressed a ~3.4 GB project to
> > ~57 MB.
>
> How big are those files? How many of them? How often do they change?
This is the initial check-in; one of the files is a 3.3 GB text file.
> > But when we tried to clone it on a big machine we got:
> >
> > fatal: Out of memory, malloc failed (tried to allocate
> > 18446744072724798634 bytes)
> >
> > This is already fixed in the 1.7.10 mainline - but it also seems like
>
> Does 1.7.9 have this problem?
I've tested with 1.7.9.1 but haven't downgraded to test with 1.7.9...
> > git needs to have at least as much free memory as the largest
> > file... Couldn't this be worked around?
> >
> > On a (32-bit) machine with 4 GB of memory this results in:
> > fatal: Out of memory, malloc failed (tried to allocate 3310214313 bytes)
> >
> > (And I see how this could be a problem, but couldn't it be mitigated? Or
> > is it by design and intended behaviour?)
>
> I think that it's delta resolving that hogs all your memory. If your
> files are smaller than 512M, try lowering core.bigFileThreshold. The
> topic jc/split-blob, which stores a big file as several smaller
> pieces, might solve your problem. Unfortunately the topic is not
> complete yet.
The problem here is that there is one file that is exactly 3310214313
bytes, so it should all be one "blob".
split-blob would be really interesting for several reasons though =)
> --
> Duy
> --
--
Ian Kumlien -- http://demius.net || http://pomac.netswarm.net