My dear lemmings,

I discovered Clonezilla a while ago and it is still my main tool for backing up and restoring the partitions I care about on my computers.

I cannot help but wonder whether there are now better, more efficient alternatives, or whether it is still a solid choice. There's nothing wrong with it; I'm just curious about others' practices and habits, and whether there are newer tools or solutions available.

Thank you for your feedback, and keep your drives safe!

  • Toribor@corndog.social · 1 year ago

    Generally I just don’t take clones of disk partitions anymore. They tend to take up too much disk space to keep more than one or two backups and typically require the disk to be unmounted which means it’s a mostly manual process. That all but guarantees that any backup I take will be out of date when I need it most.

    Instead I’ve found it better to take regular automated file level backups and automate the way I configure my environment so that I can quickly restore and rebuild if something goes wrong.

    If I just want to be able to quickly revert a drive to a previous state or have easy point-in-time restores, I manage the disk with ZFS. ZFS has a snapshotting feature which is great for this sort of thing, and you can even replicate snapshots to another ZFS pool the same way you might restore a partition to another disk, but without all the hassle of resizing things.
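
    A minimal sketch of what that looks like, assuming a pool called tank with a home dataset and a second pool called backup (all names are placeholders):

        # take a read-only, point-in-time snapshot of a dataset
        zfs snapshot tank/home@before-upgrade

        # roll the dataset back to that snapshot later
        zfs rollback tank/home@before-upgrade

        # or replicate the snapshot to another pool (e.g. a backup disk)
        zfs send tank/home@before-upgrade | zfs recv backup/home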

  • const_void@lemmy.ml · 1 year ago

    Also interested in this. Currently in need of an imaging solution that’s less clunky to use than Clonezilla.

    • strax@kbin.social · 1 year ago

      Yeah, partclone is the tool Clonezilla uses under the hood. I find that using partclone directly is easier.
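
      For example, something along these lines, assuming an ext4 partition on /dev/sda1 (device and image names are placeholders, and the partition should be unmounted first):

          # back up only the used blocks of the partition into an image file
          partclone.ext4 -c -s /dev/sda1 -o sda1.img

          # restore that image onto a partition of at least the same size
          partclone.ext4 -r -s sda1.img -o /dev/sda1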

  • HouseWolf@lemm.ee · 1 year ago

    I’ve used Clonezilla recently to clone both my main 1 TB drive and a 4 TB backup drive to an external HDD, and it worked fine both times.

    It is painfully slow, however, and I’m not sure I could do anything about that short of buying faster drives.

  • BCsven@lemmy.ca · 1 year ago

    Clonezilla or dd. If you are on GNOME you can use GNOME Disks, which has create-disk-image and restore-disk-image options if you want an .img file.
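
    If you go the plain dd route, a rough sketch (replace /dev/sdX with the real disk and make sure nothing on it is mounted):

        # image the whole disk to a file
        dd if=/dev/sdX of=disk.img bs=4M status=progress conv=fsync

        # write the image back to a disk of at least the same size
        dd if=disk.img of=/dev/sdX bs=4M status=progress conv=fsync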

  • blackstrat@lemmy.fwgx.uk · 1 year ago

    I use Clonezilla at work for imaging and deploying laptops. It works like a charm. Great piece of software. It’s not normal backup software, though.

  • loie@lemmy.world · 1 year ago

    Seconding Rescuezilla: it’s a Clonezilla front end with the sane defaults you’d probably pick anyway.

  • Pantherina@feddit.de · 1 year ago

    Yes, it works great! I used it to clone a Windows user’s stuff; he thought having a dozen partitions made sense, and it was still no problem at all. It copied everything from an HDD to a bigger SSD and just worked.

    You download the ISO and flash it to a USB stick (we used Rufus, but dd, Impression (a udisks2 frontend in GTK and Rust) or Balena Etcher should also work). The TUI is usable and has some options, but the defaults seem good.
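
    The dd variant of that flashing step is roughly this, assuming the ISO is named clonezilla-live.iso and the stick shows up as /dev/sdX (double-check the device name, since this overwrites it):

        # write the ISO directly to the USB stick
        dd if=clonezilla-live.iso of=/dev/sdX bs=4M status=progress oflag=sync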

  • Max-P@lemmy.max-p.me · 1 year ago

    The big advantage of Clonezilla or dd is that you make a perfect 1:1 copy of the disk, so you can be pretty confident it will restore perfectly, but you need a destination disk of at least the same size. It’s also ideal if you’re trying to do file recovery, because even corrupted or otherwise unreachable data is still technically on the disk.

    That’s very inefficient when you have, say, 5 GB used on a 1 TB disk, although compression will help a bit. But that’s where more specialized tools come in: what if we could back up only the actual data and end up with a 5 GB backup before compression?
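
    For instance, piping the raw copy through a compressor at least keeps the empty space from costing much on the backup side (device and file names are placeholders):

        # compressed whole-disk image
        dd if=/dev/sdX bs=4M status=progress | gzip > disk.img.gz

        # and the corresponding restore
        gunzip -c disk.img.gz | dd of=/dev/sdX bs=4M status=progress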

    That’s useful and nice, but a file-level backup can’t possibly deal with corrupted or deleted files, since it just skips over them. The backup is only as good as the set of filesystem features the archiver can encode. On Linux, tar has us pretty well covered as long as you only need relatively standard features like owners and groups. If you zip your root Linux partition you’ll end up with broken ownership and permissions, because zip doesn’t encode ACLs, xattrs, hardlinks and whatever else. On NTFS, since it’s proprietary, undocumented and a fairly complex filesystem, it’s much riskier. If you back up your game library, you’re probably fine, but if you want Windows to boot after a restore, you need a much more complete backup, and if you don’t want to take risks, whole-partition backups are much safer. ntfsclone exists, but I just don’t trust it the way I trust tar to back up my ext4 partitions correctly.
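
    As a sketch, a tar invocation for a Linux root filesystem that keeps those attributes might look like this (paths are placeholders, and it’s best run from a live environment so the filesystem isn’t changing underneath you):

        # archive / while preserving ACLs, xattrs, numeric owners and hard links,
        # without crossing into other mounted filesystems like /proc or /sys
        tar --acls --xattrs --numeric-owner --one-file-system -czf /mnt/backup/root.tar.gz /

        # restore into a freshly formatted partition mounted at /mnt/newroot
        tar --acls --xattrs --numeric-owner -xpzf /mnt/backup/root.tar.gz -C /mnt/newroot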

    So it’s all a tradeoff. Do you want efficiency, or do you want reliability? How much of the information can you afford to lose? For example, if you back up your C: drive on Windows but only care about your files and documents, not the Windows install itself, then it makes sense to just archive the files rather than make a block copy.

    So, what do you expect from your backups? The answer to that question also answers this thread.

  • MangoKangaroo@beehaw.org · 1 year ago

    I still use Clonezilla to back up devices before performing reinstalls/major updates (when Timeshift isn’t practical). No issues so far backing up and restoring both Windows and Linux partitions/drives.

  • Gabu@lemmy.ml · 1 year ago

    The main thing about Clonezilla is that you can always rely on it working, no matter the system. The bad thing is that proprietary solutions have a lot more creature comforts.

  • utopiah@lemmy.ml · 1 year ago

    Others have mentioned rsync, and I’d suggest rdiff-backup on top of it, but those are indeed for files, not partitions or disks. That being said, IMHO if you are not managing data centers and thus swapping entire physical disks by the bucket, you probably don’t need to care about the disks themselves.
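
    Roughly the kind of commands involved, with placeholder destination paths:

        # rsync: mirror the root filesystem, preserving permissions, ACLs, xattrs and hard links,
        # staying on one filesystem (-x) and deleting files that no longer exist at the source
        rsync -aAXHx --delete / /mnt/backup/rootfs/

        # rdiff-backup: similar idea, but it also keeps reverse increments,
        # so older versions of files remain restorable
        rdiff-backup /home /mnt/backup/home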

    If you genuinely have to change not just data but entire systems frequently, maybe looking at Nix or cloud-init could help.

  • WalrusDragonOnABike [they/them]@reddthat.com · 1 year ago

    Used it for cloning some laptops recently without much issue. I cloned one laptop’s primary partition onto an SD card and then imaged the others with no problem. The laptops were 256 GB (but only about 30-60 GB used) and the SD card was 64 GB. Seemed pretty simple to me.

    There are a lot of options for those who want to do things like deploy over a network, but I haven’t messed with them seriously (I didn’t have the Ethernet cables to do it and wasted a bit of time trying before realizing the laptops weren’t connected to a network; maybe there’s a way to connect via Wi-Fi, but I didn’t see it).