From mboxrd@z Thu Jan 1 00:00:00 1970
From: Agus Budy Wuysang
Subject: Re: Linux Backups
Date: Wed, 20 Nov 2002 16:06:57 +0700
Sender: linux-admin-owner@vger.kernel.org
Message-ID: <3DDB50B1.2080709@fasw.co.id>
References: <20021118114252.B27228@zerodivide.cx> <3DDA7048.60D9D6CC@rosemail.rose.hp.com> <002c01c2900b$21e9ddd0$e1110a0a@ewalsh01>
Mime-Version: 1.0
Content-Transfer-Encoding: 7bit
Return-path:
List-Id:
Content-Type: text/plain; charset="us-ascii"; format="flowed"
To: Ed Walsh
Cc: linux-admin@vger.kernel.org

Ed Walsh wrote:
> Hi All,
>
> I'm getting a little fustrated with linux backup. I have about 30GB of data
> residing on RedHat linux 7.1 that needs to be backed up. I've done backup
> with GNU tar with compression but it only backs up 2GB worth of data onto
> another machine. However, if I have 7GB of free space on the same machine,
> tar will go over the 2GB limit. So that part confuses me.
>
> I'm kinda hoping that someone knows a better way of backing up large data.
> I've somewhat been reading Amanda but I'm not sure if that's the way to go.
> I may be wrong since I'm new to linux backups.
>
> Any suggestions/recommendations or pointers to go to websites to research on
> would be greatly appreciated. Not having much luck on my own.

It sounds like you are hitting a 2 GB file-size limit. If so, you can work
around it by piping tar's output through split to break the archive into
chunks of just under 2 GB, e.g.:

[/bdir] $ tar -cvzf - /data/to/backup | split -b2000m - mybackup.gz.

or, to write the chunks into /bdir from some other directory:

[/from/anydir] $ tar ... | (cd /bdir; split ... )

This creates mybackup.gz.aa, mybackup.gz.ab, mybackup.gz.ac, etc. in /bdir,
each approximately 2 GB in size. To restore, just concatenate the chunks in
name order and untar:

cat /bdir/mybackup.gz.* | tar -xvzf -

More info: man 1 split (from the textutils package).

Hope this helps.

-- 
+-R-| Mozilla 1.0.1 Gecko/2002 |-H-| Powered by Linux 2.4.x |-7-+
|/v\ Agus Budy Wuysang                          MIS Department |
| |  Phone: +62-21-344-1316 ext 317   GSM: +62-816-1972-051    |
+------------| http://www.fasw.co.id/person/supes/ |-------------+
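For the archives, here is a minimal end-to-end sketch of the split-and-restore round trip described above. The paths under /tmp/backup-demo and the tiny -b1k chunk size are hypothetical, chosen only so the demo runs quickly; for a real 2 GB limit you would use -b2000m as in the commands above.

```shell
#!/bin/sh
# Sketch of the split/cat backup round trip; paths are hypothetical.
set -e

SRC=/tmp/backup-demo/src       # data to back up
BDIR=/tmp/backup-demo/bdir     # where the archive chunks go
REST=/tmp/backup-demo/restore  # restore target

rm -rf /tmp/backup-demo
mkdir -p "$SRC" "$BDIR" "$REST"
echo "hello backups" > "$SRC/file1.txt"
echo "second file"   > "$SRC/file2.txt"

# Create a compressed archive on stdout and split it into chunks.
# -b1k keeps the chunks tiny for this demo (use -b2000m for ~2 GB).
tar -C "$SRC" -czf - . | (cd "$BDIR"; split -b1k - mybackup.gz.)

# Restore: concatenate the chunks in name order and untar.
cat "$BDIR"/mybackup.gz.* | tar -C "$REST" -xzf -

# Verify the round trip preserved the files.
cmp "$SRC/file1.txt" "$REST/file1.txt"
cmp "$SRC/file2.txt" "$REST/file2.txt"
echo "round trip OK"
```

The key point is that split names its output files with the given prefix plus alphabetical suffixes (aa, ab, ac, ...), so a plain shell glob expands them back in the correct order for cat.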