tuppe666 Guru
Joined: 02 Mar 2004 Posts: 423
|
Posted: Mon May 17, 2004 9:32 pm Post subject: Backup Backup Backup |
|
|
I misunderstood catalyst. I got all excited: I thought it was a simple way to back up my hard drive programs, my ADSL SpeedTouch modem setup (which is fragile), and the other bits and bobs that vanished last time, when my lost+found contained all my programs. But, pants, I don't think it does what I really want. So I wondered: what is the best way of backing up the programs on my hard drive, so I can get myself up and running in Gentoo again as fast as possible? I am surprised there isn't a standard program!
Love and Kisses
Moi |
|
|
|
spamspam Apprentice
Joined: 05 Dec 2003 Posts: 153
|
Posted: Mon May 17, 2004 9:35 pm Post subject: Standard program? |
|
|
Tar |
|
|
|
NeddySeagoon Administrator
Joined: 05 Jul 2003 Posts: 54422 Location: 56N 3W
|
Posted: Mon May 17, 2004 9:39 pm Post subject: |
|
|
tuppe666,
tar to create the archive and bzip2 to compress it if you need the space. _________________ Regards,
NeddySeagoon
Computer users fall into two groups:-
those that do backups
those that have never had a hard drive fail. |
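NeddySeagoon's tar-plus-bzip2 suggestion, sketched on a throwaway directory (every path here is an example, not the poster's actual setup):

```shell
# A throwaway stand-in for the real filesystem (substitute / and real paths)
mkdir -p /tmp/demo/etc
echo "gentoo" > /tmp/demo/etc/hostname

# c = create, j = pipe through bzip2, p = preserve permissions, f = file
tar cjpf /tmp/etc-backup.tar.bz2 -C /tmp/demo etc

# Restoring later: unpack relative to the target root
mkdir -p /tmp/demo-restore
tar xjpf /tmp/etc-backup.tar.bz2 -C /tmp/demo-restore
```

On a real system you would run this as root with `-C /` so paths stay relative to the filesystem root.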
|
|
|
spamspam Apprentice
Joined: 05 Dec 2003 Posts: 153
|
Posted: Mon May 17, 2004 9:45 pm Post subject: Offline media |
|
|
It might also be a really good idea to copy the tarball to offline media such as tape, a Zip disk, or CD-R/DVD-R. |
|
|
|
billkr n00b
Joined: 01 May 2004 Posts: 5 Location: charlottesville, virginia
|
Posted: Tue May 18, 2004 4:56 pm Post subject: |
|
|
Is there a way to make tar break up a large archive into chunks of a specified size? Specifically, can I use tar to break up my photo collection into 700 MB portions to burn onto CD? I have taken a look at man tar; there is a --tape-length option, but I'm not sure if this is what I'm looking for.
Thanks for any help. |
|
|
|
FrithjofEngel n00b
Joined: 28 Feb 2003 Posts: 12
|
Posted: Tue May 18, 2004 7:07 pm Post subject: |
|
|
billkr wrote: | Is there a way to make tar break up a large archive into chunks of a specified size? Specifically, can I use tar to break up my photo collection into 700 MB portions to burn onto cd? I have taken a look at man tar, there is a --tape-length option, but I'm not sure if this is what I'm looking for.
|
It is. |
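For the curious: --tape-length works in multi-volume mode (-M). Another common route is to pipe tar through split; a sketch with toy sizes (use -b 700m for real CDs, and your real photo directory):

```shell
# A stand-in photo collection (substitute your real directory)
mkdir -p /tmp/photos /tmp/photos-restore
dd if=/dev/urandom of=/tmp/photos/img001.jpg bs=1024 count=16 2>/dev/null

# Cut the tar stream into fixed-size chunks; for 700 MB CDs use -b 700m
tar cf - -C /tmp photos | split -b 8k - /tmp/photos.tar.

# Restore: concatenate the chunks in glob (i.e. creation) order and untar
cat /tmp/photos.tar.* | tar xf - -C /tmp/photos-restore
```

Each chunk can then go onto its own disc, but note that pulling even a single file back out means reassembling all the chunks first.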
|
|
|
tam1138 Tux's lil' helper
Joined: 28 Oct 2003 Posts: 103
|
Posted: Wed May 19, 2004 12:30 am Post subject: |
|
|
The bummer with this method is that you need to reassemble all of the parts in order to retrieve a single file (even if that file is contained completely within a single part). |
|
|
|
spacejock Tux's lil' helper
Joined: 26 Sep 2003 Posts: 94 Location: Australia
|
Posted: Tue Jun 15, 2004 6:06 am Post subject: |
|
|
Tar/bz2 works, but isn't there a more efficient way?
E.g. use zip to create an archive of /etc,
use zip -u weekly to add changed files,
use zip ??? to remove files no longer present on the source.
My FULL-etc.tar.gz files are only 8 MB or so, but my FULL-usr.tar.gz is over 2 GB. Re-creating that file every week when 99% of it is the same seems like a big waste of time.
I also have an rsync command which duplicates important folders onto a second drive nightly, weekly, and monthly, but I use a USB disk to carry the entire system backup, with daily increments from 3 servers, everywhere I go - about 5 GB per server. It's VFAT formatted (I don't want to make it ext3), so I can't just rsync files onto it and preserve attributes. Also, I plug this disk in and grab the changed tar.bz2 files; it's not going to sit there for 2 hours waiting to back everything up onto it.
I'm sure there's a way of listing the files in the zip, checking if they exist (test -f?), and if not, removing them from the zip. Then add changed files back into the zip to finish off. Daily increments can just start with a new zip. I'll have to start investigating...
Cheers
Simon _________________ Author of the Hal Spacejock series ... "Better than Red Dwarf" - Tom Holt
Download the first book free |
|
|
|
n3mo l33t
Joined: 28 Mar 2004 Posts: 657 Location: In a Cruel World
|
Posted: Tue Jun 15, 2004 8:15 am Post subject: |
|
|
Code: | mkdir /backup
cd /
tar cpvf /backup/filename.tar --directory / --exclude=proc --exclude=mnt --exclude=backup --exclude=dev --exclude='*/lost+found' . |
You can put that in a script (everything except the mkdir command) and run it weekly. You could also improve the script with the --tape-length option or by using incremental backups. |
|
|
|
nobspangle Veteran
Joined: 23 Mar 2004 Posts: 1318 Location: Manchester, UK
|
Posted: Tue Jun 15, 2004 9:06 am Post subject: |
|
|
You can't use zip, as it won't store important stuff like links and file permissions.
Tar is a very versatile backup program, and will let you do incremental backups and split-volume backups.
If you're short on space you can also have it bzip2 the archive by simply adding the j option. |
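The incremental side of this is GNU tar's --listed-incremental option; a sketch with invented paths:

```shell
# Example tree standing in for the real data
mkdir -p /tmp/data /tmp/backups
echo one > /tmp/data/a.txt

# Level 0 (full) backup; the snapshot file records what was dumped
tar cjf /tmp/backups/full.tar.bz2 \
    --listed-incremental=/tmp/backups/snapshot -C /tmp data

# After some changes, the same command now archives only what's new
echo two > /tmp/data/b.txt
tar cjf /tmp/backups/incr1.tar.bz2 \
    --listed-incremental=/tmp/backups/snapshot -C /tmp data
```

To restore, extract the full archive first and then each incremental in order, passing --listed-incremental=/dev/null to each extraction.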
|
|
|
puke Tux's lil' helper
Joined: 05 Oct 2002 Posts: 128
|
Posted: Tue Jun 15, 2004 9:28 am Post subject: |
|
|
Mondo Rescue allows you to build ISOs to image your linux box. |
|
|
|
spacejock Tux's lil' helper
Joined: 26 Sep 2003 Posts: 94 Location: Australia
|
Posted: Tue Jun 15, 2004 10:22 am Post subject: |
|
|
Thanks for the reply.
From man zip:
-y Store symbolic links as such in the zip archive,
instead of compressing and storing the file
referred to by the link (UNIX only).
However, you're right about the file permissions.
I already use tar for incremental backups, and I use the z option (for gz) because it seems to be a lot faster than bz2 compression, even if the files are a bit bigger.
What I was getting at is that my system goes through a massive routine, re-creating several 1 GB+ files, when most of the content is the same. I do nightly incrementals; I was just looking for a way to cut out so much activity on my server.
In fact, I wrote a script which delves into the first-level directory (e.g. var) and then backs up each subfolder tree to a separate file - this splits some of the very large files up and makes copying the files over the network a bit easier. I guess another option would be to schedule the weekly backup of /var on Monday, /usr on Tuesday, etc.
Cheers
Simon _________________ Author of the Hal Spacejock series ... "Better than Red Dwarf" - Tom Holt
Download the first book free |
|
|
|
Janne Pikkarainen Veteran
Joined: 29 Jul 2003 Posts: 1143 Location: Helsinki, Finland
|
Posted: Tue Jun 15, 2004 10:49 am Post subject: |
|
|
You might also want to take a look at rsync. It transfers only the changed files and with some shell magic it's also possible to do incremental backups with it.
http://rsync.samba.org/examples.html _________________ Yes, I'm the man. Now it's your turn to decide if I meant "Yes, I'm the male." or "Yes, I am the Unix Manual Page.". |
|
|
|
spacejock Tux's lil' helper
Joined: 26 Sep 2003 Posts: 94 Location: Australia
|
Posted: Tue Jun 15, 2004 11:03 am Post subject: |
|
|
"I also have an rsync command which duplicates important folders onto a second drive nightly, weekly, monthly,"
Actually, that should have said 'script', not 'command'. But I'm using rsync, and it's good. I wrote something similar for Windows recently - nowhere near as many features, but it allows me to sync important folders onto my backup disks, based on file modification time.
Cheers
Simon _________________ Author of the Hal Spacejock series ... "Better than Red Dwarf" - Tom Holt
Download the first book free |
|
|
|
eelleemmeenntt n00b
Joined: 02 Jul 2004 Posts: 6 Location: London Ont. Canada
|
Posted: Thu Jul 22, 2004 1:26 am Post subject: |
|
|
Hey, I have two bash scripts that work differently for backup. One uses rsync and the other rdiff-backup. Feel free to use them. I hope they help.
http://publish.uwo.ca/~nelkadri/ _________________ Haunted-Cave |
|
|
|
Kraymer Guru
Joined: 27 Aug 2003 Posts: 349 Location: Germany
|
Posted: Mon Aug 16, 2004 10:00 am Post subject: rdiff-backup |
|
|
Hi there!
I was looking for something similar and didn't want to stick with the tar-bzip2 solution. Searching portage with '-s backup', I found rdiff-backup, which looks quite interesting, although your data won't be compressed automatically.
Sebastian |
|
|
|
blscreen Tux's lil' helper
Joined: 04 Mar 2003 Posts: 118 Location: Innsbruck
|
Posted: Mon Aug 16, 2004 12:30 pm Post subject: |
|
|
I use an rdiff-backup server in my small home network and it has saved me several times.
rdiff-backup mirrors your backup data uncompressed. But if a file changes, rdiff-backup saves the new version together with the compressed differences between the versions, so you can restore files from the past.
The idea of not compressing the main mirror is that in case of a complete loss you can use the backup data as a read-only fileserver in seconds.
The drawback is that you need more space for the initial backup, but after that the backup tree will grow slowly. I have a 12 GB backup partition for user data, and I can keep increments going back several months. |
|
|
|
rzZzn Tux's lil' helper
Joined: 24 Aug 2004 Posts: 96 Location: Sweden
|
Posted: Tue Aug 24, 2004 7:29 am Post subject: |
|
|
I use webmin |
|
|
|
mazirian Apprentice
Joined: 26 Jun 2003 Posts: 273 Location: Yarmouth, ME
|
Posted: Tue Aug 24, 2004 6:18 pm Post subject: |
|
|
I second blscreen's recommendation of rdiff-backup; that's a great package.
cpio is also a useful tool. From what I understand, it may have some advantages over tar if anything gets corrupted. That may be bs, I don't know.
I just started using it to back up my home directory with the following:
Code: |
find /home/bowman | cpio -o -H newc | bzip2 > perciplex-home-bowman-$(date +%F).cpio.bz2
|
|
|
|
|
|