
Thread: linux filesystem imaging

  1. #1
    Senior Member
    Join Date
    May 2004
    Posts
    274

    linux filesystem imaging

    Hi all,

    Is it possible to make an unattended filesystem backup (or filesystem image), both incremental and full, and then transfer the image to a remote machine (over SSH or FTP)? The filesystem is ext3 and the OS is Red Hat 9. I searched around and came across dump, restore, rdiff-backup and rsync, and I will be trying one of these solutions. Since this would be done on a production system, it has to stay up the whole time; I mean both running and imaging at the same time. Would anyone like to describe their backup strategy, mainly using open source tools?
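    For example, as an unattended nightly job I was imagining something along these lines with rsync or rdiff-backup over SSH (host names and paths here are just placeholders, nothing I have tested yet):

    Code:
    # mirror the filesystem to a remote machine over ssh (stay on one filesystem, keep perms and links)
    rsync -aHx --delete --exclude=/proc / backup@remote.host:/backups/webserver/
    # or keep reverse increments on the remote side instead of a plain mirror
    rdiff-backup --exclude /proc / backup@remote.host::/backups/webserver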


    Thanks
    Excuse me, is there an airport nearby large enough for a private jet to land?

  2. #2
    Senior Member
    Join Date
    Jan 2003
    Posts
    3,914
    Hey Hey,

    For imaging, I've always liked g4u and/or g4l... Apparently g4l is infringing on g4u's copyright so I'll post from g4u.

    I'm not sure if you could take their utilities and dump them to disk instead but it might be something to play with.

    Source: http://www.feyrer.de/g4u/#imgcreate

    Type "uploaddisk your.ftp.server.com filename.gz" to read out the machine's harddisk (rwd0d), and put it into the "install" account of your FTP server under the given filename. The disk image is compressed (with gzip -9), so maybe use a ".gz" file suffix. You don't have to, though. Before putting the file on the FTP server, the "install" account's password is requested.
    If you want to clone your second IDE disk, add its name on the uploaddisk command line: "uploaddisk your.ftp.server.com filename.gz wd1". Similarly, if you use SCSI instead of IDE disks, use "uploaddisk your.ftp.server.com filename.gz sd0".
    Normally you'd have to boot from the g4u media for this to work, but you could test running it while your OS is running (assuming you have a test server). It's a great way to create and store a full disk image on a remote server.
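    If you can't boot the g4u CD on that box, something along these lines on a running Linux system would give you roughly the same end result (device name and destination host are just examples, and imaging a disk while it's mounted and busy can leave the image slightly inconsistent):

    Code:
    # raw-copy the first IDE disk, compress it, and push it to a remote machine over ssh
    dd if=/dev/hda bs=1024k | gzip -9 | ssh user@backup.host "cat > /backups/webserver-hda.img.gz"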

    Peace,
    HT
    IT Blog: .:Computer Defense:.
    PnCHd (Pronounced Pinched): Acronym - Point 'n Click Hacked. As in: "That website was pinched" or "The skiddie pinched my computer because I forgot to patch".

  3. #3
    Senior Member
    Join Date
    May 2004
    Posts
    274
    thanks for the quick reply,

    Originally posted here by HTRegz

    Normally you'd have to boot from the g4u media for this to work, but you could test running it while your OS is running (assuming you have a test server). It's a great way to create and store a full disk image on a remote server.

    HT
    How can I test g4u on a live system, since it is made for bootable CDs and floppies? I also think it will not work in a differential or incremental style; it would only do a full system image.


    Thanks
    Excuse me, is there an airport nearby large enough for a private jet to land?

  4. #4
    Just Another Geek
    Join Date
    Jul 2002
    Location
    Rotterdam, Netherlands
    Posts
    3,401
    Dump and restore are the standard Unix(-like) backup utilities.

    Want to back up a remote system?

    Code:
    # level-0 (full) dump of the remote disk, written to stdout and compressed locally
    ssh backup_user@remote.system "dump -0 -f - /dev/ad0" | gzip > myremotebackup.gz
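    And on your ext3 box, dump can also do the incremental side; a rough sketch (the device name and paths are just examples):

    Code:
    # level 0 = full dump; level 1 = only what changed since the last lower-level dump
    # (-u records the dump date in /etc/dumpdates so the levels chain together)
    dump -0u -f - /dev/hda1 | gzip > /backups/full.dump.gz
    dump -1u -f - /dev/hda1 | gzip > /backups/incr.dump.gz
    # restore: run from inside the freshly created filesystem you are rebuilding
    gunzip -c /backups/full.dump.gz | restore -r -f -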
    Oliver's Law:
    Experience is something you don't get until just after you need it.

  5. #5
    Senior Member
    Join Date
    May 2004
    Posts
    274
    I have further categorized the scenario as follows.
    I will be having two types of backups:
    1) Incremental backups (daily); this I can accomplish using tar (see the tar sketch at the end of this post).
    2) A weekly system snapshot, i.e. cloning the hard drive, so that I have the option of bare-metal recovery.
    I checked g4u and partimage for the second option, but there is a problem: both require system downtime, i.e. I have to insert a bootable CD into the drive and boot from it to make a snapshot of the system, or at least the partition that is going to be imaged has to be unmounted at that time. The problem is that downtime is impossible since the machine is our main web server and must be up all the time. The other strategy is based on the following steps (roughly sketched in commands after the recovery steps below):
    1) copying sector 0 of the disk
    2) copying the partition table
    3) making a tar archive of the whole filesystem

    For recovery:
    1) restoring sector 0
    2) restoring the partition table
    3) untarring the archive
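    Something like this is what I have in mind for the snapshot and the bare-metal restore; sfdisk is just my assumption for saving the partition table, the device names and paths are examples, and I realize tarring a live filesystem can catch files mid-change:

    Code:
    # --- weekly snapshot, run on the live server ---
    dd if=/dev/hda of=/backups/sector0.bin bs=512 count=1      # sector 0 (MBR + partition table)
    sfdisk -d /dev/hda > /backups/partition-table.txt          # partition layout in re-loadable form
    tar czpf /backups/fullsystem.tar.gz --exclude=/proc --exclude=/backups /
    # --- bare-metal restore, after booting a rescue CD with the new disk in place ---
    dd if=/backups/sector0.bin of=/dev/hda bs=512 count=1
    sfdisk /dev/hda < /backups/partition-table.txt
    # recreate the filesystems (mke2fs -j), mount them under /mnt, then unpack
    tar xzpf /backups/fullsystem.tar.gz -C /mnt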

    What are your thoughts and backup strategies for such situations?
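    For the daily incrementals in 1), I was thinking of GNU tar's listed-incremental mode run from cron, roughly like this (paths and host are just examples):

    Code:
    # only files changed since the state recorded in the snapshot (.snar) file get archived
    tar czpf /backups/incr-$(date +%Y-%m-%d).tar.gz \
        --listed-incremental=/backups/system.snar \
        --exclude=/proc --exclude=/backups /
    # then ship it off the box, e.g.
    scp /backups/incr-$(date +%Y-%m-%d).tar.gz backup@remote.host:/backups/webserver/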


    Thanks
    Excuse me, is there an airport nearby large enough for a private jet to land?
