linux-admin.vger.kernel.org archive mirror
* zip error: out of memory
@ 2004-07-28 15:35 Luca Ferrari
  2004-07-28 19:40 ` Chris DiTrani
  2004-07-28 20:17 ` chuck gelm
  0 siblings, 2 replies; 8+ messages in thread
From: Luca Ferrari @ 2004-07-28 15:35 UTC (permalink / raw)
  To: linux-admin

Hi,
I was doing a backup of a NFS partition (about 300 MB) using zip but I got the 
error:
Zip error: Out of memory (allocating temp filename)
I've tried to find some information, but was unable to. Since I've used
zip to compress SMB partitions over 1 GB, I don't believe it can be a
"size" problem. I've tried to specify the temporary directory with the
-b flag, but nothing changed. My disk has about 11 GB of free space, so
do you have any idea or suggestion about this problem?

Thanks,
Luca

-- 
Luca Ferrari,
fluca1978@infinito.it

^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-28 15:35 zip error: out of memory Luca Ferrari
@ 2004-07-28 19:40 ` Chris DiTrani
  2004-07-29 17:06   ` Glynn Clements
  2004-07-28 20:17 ` chuck gelm
  1 sibling, 1 reply; 8+ messages in thread
From: Chris DiTrani @ 2004-07-28 19:40 UTC (permalink / raw)
  To: fluca1978; +Cc: linux-admin

On Wed, 2004-07-28 at 11:35, Luca Ferrari wrote:
> Hi,
> I was doing a backup of a NFS partition (about 300 MB) using zip but I got the 
> error:
> Zip error: Out of memory (allocating temp filename)

All assuming you are running recent stuff...

Either this code is failing:

  tempzip = malloc(4);
  if (tempzip == NULL) {
    ZIPERR(ZE_MEM, "allocating temp filename");
  }

or this code (both from Info-ZIP's zip.c source):

  if ((tempzip = tempname(zipfile)) == NULL) {
    ZIPERR(ZE_MEM, "allocating temp filename");
  }

On Linux (if I'm preprocessing the MANY #if(n)defs in Info-ZIP's
fileio.c properly) tempname() runs mktemp() after malloc'ing a 12-byte
buffer. So either malloc() is failing, or mktemp() is returning a null
pointer.

But here's the calling code:

  ...
  if ((t = malloc(12)) == NULL)
    return NULL;
  *t = 0;
  ...
  strcat(t, "ziXXXXXX"); /* must use lowercase for Linux dos file system */
  return mktemp(t);



And here's the glibc mktemp() code:

mktemp (template)
     char *template;
{
  if (__gen_tempname (template, __GT_NOCREATE) < 0)
    /* We return the null string if we can't find a unique file name. */
    template[0] = '\0';

  return template;
}



So I can't see how tempname() returns a NULL pointer under any
circumstance other than malloc failing. That could be a side effect of
the heap getting buggered by other code, or maybe you really are out of
memory. It looks like zip allocates memory for every file it finds and
only frees it after it's done, so it's not the size of the backup set
so much as the number of files that impacts memory use.
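Since it's the entry count rather than the byte count that drives zip's
memory use, a quick way to gauge exposure is to count the files under
the mount. A sketch (the path is whatever you're zipping; pass it as the
first argument):

```shell
#!/bin/sh
# Count the entries zip will have to track; each one costs heap memory
# (the path string plus central-directory bookkeeping), so a very large
# file count can matter on a small machine.
DIR=${1:-.}
find "$DIR" -type f | wc -l
```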

If you can build the source, I'd put some printf() calls in zip.c and
especially fileio.c and run it again.


CD


^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-28 15:35 zip error: out of memory Luca Ferrari
  2004-07-28 19:40 ` Chris DiTrani
@ 2004-07-28 20:17 ` chuck gelm
  2004-07-29 13:11   ` Luca Ferrari
  1 sibling, 1 reply; 8+ messages in thread
From: chuck gelm @ 2004-07-28 20:17 UTC (permalink / raw)
  To: fluca1978; +Cc: linux-admin

Luca Ferrari wrote:
> Hi,
> I was doing a backup of a NFS partition (about 300 MB) using zip but I got the 
> error:
> Zip error: Out of memory (allocating temp filename)
> I've tried to find some information but I was unable. Since I've used zip to 
> compress SMB partitions over 1 GB I don't believe that it can be a "size" 
> problem. I've tried to specify the temporary directory with the -b flag, but 
> nothing changed. On my disk I've got about 11 GB of free space, so have you 
> any idea or suggestion about this problem?
> 
> Thanks,
> Luca
> 
Hi, Luca:

  Show us exactly what you did (the exact command line).
Show us the free space of your 'cwd' and your temporary directory.
Show us the size of the partition you are trying to backup.

Perhaps you could run 'watch -d df' on another console to see
which filesystem is filling up, or you could run 'free -s 9' on
another console to see whether you are running out of RAM.

Hope this helps, Chuck


^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-28 20:17 ` chuck gelm
@ 2004-07-29 13:11   ` Luca Ferrari
  2004-07-29 14:20     ` Chris DiTrani
  2004-07-29 21:57     ` chuck gelm
  0 siblings, 2 replies; 8+ messages in thread
From: Luca Ferrari @ 2004-07-29 13:11 UTC (permalink / raw)
  To: linux-admin

On Wednesday 28 July 2004 22:17 chuck gelm's cat walking on the keyboard  
wrote:

> Luca Ferrari wrote:
> > Hi,
> > I was doing a backup of a NFS partition (about 300 MB) using zip but I
> > got the error:
> > Zip error: Out of memory (allocating temp filename)
> > I've tried to find some information but I was unable. Since I've used zip
> > to compress SMB partitions over 1 GB I don't believe that it can be a
> > "size" problem. I've tried to specify the temporary directory with the -b
> > flag, but nothing changed. On my disk I've got about 11 GB of free space,
> > so have you any idea or suggestion about this problem?
> >
> > Thanks,
> > Luca
>
> Hi, Luca:
>
>   Show us exactly what you did (the exact command line).
> Show us the free space of your 'cwd' and your temporary directory.
> Show us the size of the partition you are trying to backup.
>
> Perhaps you could run 'watch -d df' on another console and see
> what directory is getting 'Out of memory' or
>   your could run 'free -s 9' on another console to see if you
>   are running out of RAM memory.
>

Here's the command line I use:

/usr/bin/zip  -r -u -y -b /tmp/backup  /mnt/disco2//letizia.zip * 

zip error: Out of memory (allocating temp filename)

and the memory as reported by free before and during the zipping:

             total       used       free     shared    buffers     cached
Mem:        256624     253180       3444          0      40412      85796
-/+ buffers/cache:     126972     129652
Swap:      1020088         16    1020072

             total       used       free     shared    buffers     cached
Mem:        256624     254096       2528          0      34604      92080
-/+ buffers/cache:     127412     129212
Swap:      1020088         16    1020072


Any idea?

Luca




-- 
Luca Ferrari,
fluca1978@infinito.it

^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-29 13:11   ` Luca Ferrari
@ 2004-07-29 14:20     ` Chris DiTrani
  2004-07-29 21:57     ` chuck gelm
  1 sibling, 0 replies; 8+ messages in thread
From: Chris DiTrani @ 2004-07-29 14:20 UTC (permalink / raw)
  To: fluca1978; +Cc: linux-admin

On Thu, 2004-07-29 at 09:11, Luca Ferrari wrote:

> 
> Here's the command line I use:
> 
> /usr/bin/zip  -r -u -y -b /tmp/backup  /mnt/disco2//letizia.zip * 
> 
> zip error: Out of memory (allocating temp filename)
> 
> and the memory as reported by free before and during the zipping:
> 
>              total       used       free     shared    buffers     cached
> Mem:        256624     253180       3444          0      40412      85796
> -/+ buffers/cache:     126972     129652
> Swap:      1020088         16    1020072
> 
>              total       used       free     shared    buffers     cached
> Mem:        256624     254096       2528          0      34604      92080
> -/+ buffers/cache:     127412     129212
> Swap:      1020088         16    1020072
> 

Nothing unusual here.

I had misread your original post as zipping from an NTFS partition. If I
hadn't, I would have said that the interaction with NFS is probably the
root cause of your trouble and that you should use a different strategy.

As a quick workaround, you could use tar or rsync to get a copy of the
NFS mount onto local disk and zip it from there. I've done both for much
larger NFS mounts without trouble.

A better strategy might be to do the zip on the machine that hosts the
NFS export and transfer the zip file via ftp or rsync.

FWIW, my strategy for backup is to have a backup machine running rsync
as a service. Any machine that has data to backup does a:

rsync -az --delete /local-tree-to-backup/ \
rsync://backup-machine/backup-dir

This creates a snapshot of the backup source on the backup machine. I
then compress the snapshot and store it, leaving the snapshot in place
so that the next time a machine does the rsync maneuver only the diffs
are sent, making the process much more efficient on an ongoing basis.
The downside is that you need disk space to keep the snapshots around,
but even if you don't keep them, rsync with the -z option compresses the
data before sending it over the network, so it's still much better than
grabbing the whole thing uncompressed over NFS.
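The compress-and-store step on the backup machine might look something
like this sketch (all paths and the 'clientA' name are illustrative,
not from the setup above):

```shell
#!/bin/sh
# Snapshot-then-compress cycle, run on the backup machine after a client
# has rsync'd into its snapshot directory. Paths here are placeholders.
SNAP=${1:-/backup/snapshots/clientA}   # where rsync drops the snapshot
ARCHIVES=${2:-/backup/archives}        # where dated archives accumulate
mkdir -p "$ARCHIVES"
tar czf "$ARCHIVES/clientA-$(date +%Y%m%d).tar.gz" -C "$SNAP" .
# The snapshot stays in place, so the next rsync sends only the diffs.
```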


CD



^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-28 19:40 ` Chris DiTrani
@ 2004-07-29 17:06   ` Glynn Clements
  0 siblings, 0 replies; 8+ messages in thread
From: Glynn Clements @ 2004-07-29 17:06 UTC (permalink / raw)
  To: linux-admin; +Cc: fluca1978


Chris DiTrani wrote:

> > I was doing a backup of a NFS partition (about 300 MB) using zip but I got the 
> > error:
> > Zip error: Out of memory (allocating temp filename)

> So I can't see how tempname() returns a NULL pointer under any
> circumstance other than malloc failing. Which could be a side-effect of
> the heap getting buggered by other code, or maybe you really are out of
> mem. Looks like zip allocates mem for every file it finds and only frees
> it after it's done, so it's not the size of the backup set so much as it
> is the number of files that impact memory use.

That makes sense.

The zip format stores all file information in a directory at the end
of the file. The size of the directory will be proportional to the
number of files in the archive, and I suspect that it will store this
information in memory rather than a temporary file (most zip files
don't contain an entire filesystem).

One solution would be to create a .tar.gz (or .tar.bz2) file instead. 
A tar file stores each file's information in a header preceding the
file's contents, so there is no need to accumulate the information in
memory.

OTOH, one disadvantage of tar files is that listing the contents
requires scanning the entire file, whereas listing a zip file only
involves seeking to the directory then listing it (although this
requires that the zip file is stored on a medium which supports random
access, i.e. not on tape).
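To make the trade-off concrete, here are the two listing operations side
by side (archive names and the data path are placeholders):

```shell
#!/bin/sh
# tar must decompress and stream the whole archive to reach every
# per-file header; unzip only seeks to the central directory at the
# end of the file.
tar czf backup.tar.gz -C /path/to/data .   # create the tar.gz
tar tzf backup.tar.gz                      # list: reads the entire file
unzip -l backup.zip                        # list: seeks to the directory
```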

-- 
Glynn Clements <glynn.clements@virgin.net>

^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-29 13:11   ` Luca Ferrari
  2004-07-29 14:20     ` Chris DiTrani
@ 2004-07-29 21:57     ` chuck gelm
  2004-07-30  8:50       ` Luca Ferrari
  1 sibling, 1 reply; 8+ messages in thread
From: chuck gelm @ 2004-07-29 21:57 UTC (permalink / raw)
  To: fluca1978; +Cc: linux-admin

Luca Ferrari wrote:
> On Wednesday 28 July 2004 22:17 chuck gelm's cat walking on the keyboard  
> wrote:
> 
> 
>>Luca Ferrari wrote:
>>
>>>Hi,
>>>I was doing a backup of a NFS partition (about 300 MB) using zip but I
>>>got the error:
>>>Zip error: Out of memory (allocating temp filename)
>>>I've tried to find some information but I was unable. Since I've used zip
>>>to compress SMB partitions over 1 GB I don't believe that it can be a
>>>"size" problem. I've tried to specify the temporary directory with the -b
>>>flag, but nothing changed. On my disk I've got about 11 GB of free space,
>>>so have you any idea or suggestion about this problem?
>>>
>>>Thanks,
>>>Luca
>>
>>Hi, Luca:
>>
>>  Show us exactly what you did (the exact command line).
>>Show us the free space of your 'cwd' and your temporary directory.
>>Show us the size of the partition you are trying to backup.
>>
>>Perhaps you could run 'watch -d df' on another console and see
>>what directory is getting 'Out of memory' or
>>  your could run 'free -s 9' on another console to see if you
>>  are running out of RAM memory.
>>
> 
> 
> Here's the command line I use:
> 
> /usr/bin/zip  -r -u -y -b /tmp/backup  /mnt/disco2//letizia.zip * 
> 
> zip error: Out of memory (allocating temp filename)
> 
> and the memory as reported by free before and during the zipping:
> 
>              total       used       free     shared    buffers     cached
> Mem:        256624     253180       3444          0      40412      85796
> -/+ buffers/cache:     126972     129652
> Swap:      1020088         16    1020072
> 
>              total       used       free     shared    buffers     cached
> Mem:        256624     254096       2528          0      34604      92080
> -/+ buffers/cache:     127412     129212
> Swap:      1020088         16    1020072
> 
> 
> Any idea?
> 
> Luca

Yes.
Is '/tmp/backup' a directory or a regular file?

If the directory '/tmp' already exists,
try that command line again, but replace '/tmp/backup' with '/tmp'.
I am guessing that either the directory 'backup' does not already
exist under '/tmp', or '/tmp/backup' already exists as a regular
file, or you cannot write to the directory.

What is the free space of '/tmp'?
Show us the output of 'df'.
Do you have 'rights' to create files under '/tmp'?
  (are you running 'zip' as 'root' ?)
Make sure that you have correct access rights to '/tmp/backup'.
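A few quick checks along those lines (a sketch; /tmp/backup is the -b
argument from the command line above, and can be overridden):

```shell
#!/bin/sh
# Sanity-check zip's -b work directory: it must be a writable
# directory, not a regular file.
WORKDIR=${1:-/tmp/backup}
ls -ld "$WORKDIR"                     # 'd' in column one means directory
df "$WORKDIR" 2>/dev/null || df /tmp  # free space where temp files go
if [ -d "$WORKDIR" ] && [ -w "$WORKDIR" ]; then
    echo "$WORKDIR: writable directory, OK"
else
    echo "$WORKDIR: not a writable directory"
fi
```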

HTH, Chuck


^ permalink raw reply	[flat|nested] 8+ messages in thread

* Re: zip error: out of memory
  2004-07-29 21:57     ` chuck gelm
@ 2004-07-30  8:50       ` Luca Ferrari
  0 siblings, 0 replies; 8+ messages in thread
From: Luca Ferrari @ 2004-07-30  8:50 UTC (permalink / raw)
  To: linux-admin

On Thursday 29 July 2004 23:57 chuck gelm's cat walking on the keyboard  
wrote:

> Luca Ferrari wrote:
> > On Wednesday 28 July 2004 22:17 chuck gelm's cat walking on the keyboard
> >
> > wrote:
> >>Luca Ferrari wrote:
> >>>Hi,
> >>>I was doing a backup of a NFS partition (about 300 MB) using zip but I
> >>>got the error:
> >>>Zip error: Out of memory (allocating temp filename)
> >>>I've tried to find some information but I was unable. Since I've used
> >>> zip to compress SMB partitions over 1 GB I don't believe that it can be
> >>> a "size" problem. I've tried to specify the temporary directory with
> >>> the -b flag, but nothing changed. On my disk I've got about 11 GB of
> >>> free space, so have you any idea or suggestion about this problem?
> >>>
> >>>Thanks,
> >>>Luca
> >>
> >>Hi, Luca:
> >>
> >>  Show us exactly what you did (the exact command line).
> >>Show us the free space of your 'cwd' and your temporary directory.
> >>Show us the size of the partition you are trying to backup.
> >>
> >>Perhaps you could run 'watch -d df' on another console and see
> >>what directory is getting 'Out of memory' or
> >>  your could run 'free -s 9' on another console to see if you
> >>  are running out of RAM memory.
> >
> > Here's the command line I use:
> >
> > /usr/bin/zip  -r -u -y -b /tmp/backup  /mnt/disco2//letizia.zip *
> >
> > zip error: Out of memory (allocating temp filename)
> >
> > and the memory as reported by free before and during the zipping:
> >
> >              total       used       free     shared    buffers     cached
> > Mem:        256624     253180       3444          0      40412      85796
> > -/+ buffers/cache:     126972     129652
> > Swap:      1020088         16    1020072
> >
> >              total       used       free     shared    buffers     cached
> > Mem:        256624     254096       2528          0      34604      92080
> > -/+ buffers/cache:     127412     129212
> > Swap:      1020088         16    1020072
> >
> >
> > Any idea?
> >
> > Luca
>
> Yes.
> Is '/tmp/backup' a directory or a regular file?
>

Oops! It was a file. I've changed it to a directory, granted write
permissions, and now it seems to work.
Thanks a lot.

Luca

-- 
Luca Ferrari,
fluca1978@infinito.it

^ permalink raw reply	[flat|nested] 8+ messages in thread

end of thread, other threads:[~2004-07-30  8:50 UTC | newest]
