For those of you running Unraid and backing up your data to Backblaze, how are you doing it?

I’ve been playing a bit with KopiaUI, but what is the easiest and most straightforward way?

Bonus points if I can have the same “client/software/utility” back up data from non-servers (Windows, macOS and Linux) on my network to said Unraid server.

I don’t want to complicate the setup with a bunch of dependencies or anything that would make the recovery process long and tedious or require in-depth knowledge to carry out.

To recap:

Workstations/laptop > back up data with what? > Unraid

Unraid > back up data with what? > Backblaze

  • Scrubbles@poptalk.scrubbles.tech · 1 year ago

    I don’t do it to Backblaze anymore, but the easiest way to back up is usually the simplest, and in my case that was rclone.

    rclone supports so many backends, and Backblaze is one of them.

    Set up rclone on Unraid (either install it as a plugin or install it directly on the box from the shell), set up Backblaze as the remote, and then set up a job to run it (again, if you like a UI, try UserScripts; if you want to do this bare metal, just add it to crontab).
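
    For example, a bare-bones version of that job might look something like the sketch below; the remote name, bucket, paths, and schedule are all placeholders rather than a recommended setup.

    #!/bin/bash
    # Nightly push of one share to Backblaze; "b2remote" and the paths are placeholders
    rclone copy /mnt/user/myshare b2remote:my-bucket/myshare -v --log-file=/var/log/rclone-myshare.log

    # Example crontab entry to run it at 3am:
    # 0 3 * * * /boot/config/scripts/backup-myshare.sh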

    I have my rclone jobs split up by share personally, because I like the granularity, and I run each one every night. I also have a few protections turned on so that if it detects too much has changed (like a ransomware attack), it kills the job and sends me an error instead of running.

    Bonus points: you can layer a crypt remote on top of it if you’d like to encrypt the data you upload to Backblaze as well.

    For workstation to Unraid, you have choices for how to set Unraid as your backend:

    • SSH: easiest, and probably the most secure
    • Local: meaning you’ve mounted Unraid as a local mount point/drive in Windows and just want to copy; it’ll use the mount as the connection
    • (S)FTP: would be more involved, but just another option
    • I’m sure there are a few more; I use SSH and/or local for workstation to Unraid.

    Or you can use whatever your OS bundles, too; if you can mount a share in your OS (I don’t know of one where you can’t), then you could use the built-in backup utility. Play around and see what works best for you.
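
    As a concrete example, the SSH/SFTP route from a workstation can be as small as the sketch below; the remote name, host, and paths are placeholders, and you’d create the sftp remote once with rclone config pointed at the Unraid box.

    #!/bin/bash
    # Push a workstation folder to an Unraid share over SFTP; "unraid-sftp" is a placeholder
    # remote created beforehand with rclone config (type sftp, host = the Unraid box, plus your user/key)
    rclone copy ~/Documents unraid-sftp:/mnt/user/backups/laptop/Documents -v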

    • lal309@lemmy.worldOP · 1 year ago

      Mind sharing the script (without the sensitive data)?

      I haven’t taken the plunge into rclone because of the scripting part of the equation. Just not great at bash.

      • Scrubbles@poptalk.scrubbles.tech · 1 year ago

        It pretty much runs itself. If you’re a bit nervous, try making a folder locally on your workstation and syncing it with Backblaze, just to get a feel for it. We all do a few test runs first anyway. Essentially you’ll need to:

        1. Install rclone
        2. Run rclone config; this will guide you step by step through adding a remote, adding credentials, etc. There’s a full guide in the rclone docs.
        3. Try running a copy job for your test folder. It’ll be something like rclone copy /path/to/local/folder remotenameyoucreated:/

        Then go check and see if it showed up in Backblaze. Play with some of the flags; maybe you want -v so it’ll print out everything it does. During testing, --dry-run can be a lifesaver.

        One big caveat: make sure you read the manual on the difference between rclone copy and rclone sync, and make sure you understand both before choosing one of them.
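
        In short, and worth double-checking against the docs for your rclone version:

        # copy: adds and updates files on the destination, never deletes anything there
        rclone copy /path/to/local/folder remotenameyoucreated:/ --dry-run -v

        # sync: makes the destination match the source, deleting remote files that no longer
        # exist locally - great for mirroring, risky if you point it the wrong way
        rclone sync /path/to/local/folder remotenameyoucreated:/ --dry-run -v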

        I’m purposely leaving out my script because I think if you’re getting started in scripting you should start small. There’s no big script that will apply to every system. Just try copying one directory to Backblaze from your workstation first, then try another, then add those to a script and run it. Mine really did start as:

        #!/bin/bash
        rclone copy /mnt/user/myshare myremote:/myshare
        

        It’ll just grow with time as you add more to it, add caveats, add rules, etc. Our labs are always evolving; there’s no silver-bullet answer. Good luck, good testing, and ping back here if you have any other questions!

        • lal309@lemmy.worldOP · 1 year ago

          My fault! This is exactly what I was going to do, as I’m unfamiliar with rclone. My comment about the script was about how to check that the remote is actually available before trying, how to make sure the job ran successfully, how to send a notification upon success/failure, etc.

          I’m pretty novice at bash, but I know other languages very well. Concepts apply more or less the same across languages; the only thing that changes in most cases is syntax.

          Anyways, good write up and I appreciate your feedback.
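
          A rough sketch of the “is the remote even reachable” precheck, with a placeholder remote name:

          #!/bin/bash
          # Bail out early if the remote can't be listed (network down, bad credentials, etc.)
          if ! rclone lsd myremote: > /dev/null 2>&1; then
            echo "Remote not reachable, skipping backup"
            exit 1
          fi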

          • Scrubbles@poptalk.scrubbles.tech · 1 year ago

            I’m still working on the amount-changed check, making sure I’m happy with it before it’s completely automated. I’m parsing the results of the summary and --dry-run as a precheck, so that if, say, 10% or more of the files would be changed, it cancels the job and sends me a notification instead, and then I’d run it manually myself. Still fine-tuning and not quite happy with it yet.
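
            A rough sketch of that kind of precheck, for anyone who wants to experiment; the jq dependency and the grep pattern for dry-run output are assumptions, so check them against your rclone version:

            #!/bin/bash
            # Hypothetical precheck: skip the backup if a dry run says ~10% or more of the files would change
            SRC=/mnt/user/myshare
            DST=myremote:/myshare

            # Total number of files currently in the source (rclone size --json prints a "count" field)
            total=$(rclone size "$SRC" --json | jq '.count')

            # Count planned actions from a dry run; assumes each planned transfer/delete is logged
            # with a --dry-run notice, so verify the pattern against your rclone's output
            planned=$(rclone sync "$SRC" "$DST" --dry-run 2>&1 | grep -c -- '--dry-run')

            if [ "$total" -gt 0 ] && [ $((planned * 100 / total)) -ge 10 ]; then
              echo "Precheck failed: $planned of $total files would change. Skipping backup."
              exit 1
            fi

            rclone sync "$SRC" "$DST" -v --log-file=/var/log/rclone-myshare.log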

            My first iteration was to cherry-pick a few “key” files scattered randomly around my file system, things that I will probably never, ever change. This has been proven to work for a long time and honestly has saved my ass, because I did accidentally wipe out a few files once and this made sure my backups wouldn’t run until I fixed it. It’s a bit dumb, but it did the trick for me:

            #!/bin/bash
            
            # check_file takes in a path to a local file and its known-good hash
            function check_file {
              actual=($(md5sum "$1"))
              if [[ "$actual" != "$2" ]]; then
                echo "ERROR: $1 did not match its hash value."
                echo "$actual != $2"
                echo "Possible attack.  Exiting"
                exit 1
              fi
              echo "Validated $1 matches the checksum on file"
            }
            
            echo Starting Safety Checks
            
            check_file /mnt/user/myshare/mything b04b917c1f66e52adf2722d35f9b51b6
            # about 5 random per share
            
            rclone copy /mnt/user/myshare myremote:/myshare
            

            Like I said, don’t judge too much, but for me it’s a “poor man’s ransomware checker”. If any of those files have been modified, don’t perform the backup, notify me, and shut down.

            • lal309@lemmy.worldOP · 1 year ago

              The script I’m building is coming along nicely. I have it checking for error codes, logging to a file, and sending Discord notifications. Next step is to finish up all the rclone copies and syncs I want to do and then try encrypting the data client-side before uploading to the cloud. I’ve seen mention of crypt but can’t find it in the official rclone docs (yet).
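
              For reference, the error-code and notification half of that can be as small as the sketch below; the webhook URL, remote name, and paths are placeholders.

              #!/bin/bash
              # Run the job, log to a file, and post the result to a Discord webhook (placeholder URL)
              WEBHOOK="https://discord.com/api/webhooks/XXXX/YYYY"

              rclone sync /mnt/user/myshare myremote:/myshare -v --log-file=/var/log/rclone-myshare.log
              rc=$?

              if [ "$rc" -eq 0 ]; then
                msg="Backup of myshare finished OK"
              else
                msg="Backup of myshare FAILED (exit code $rc)"
              fi

              curl -s -H "Content-Type: application/json" -d "{\"content\": \"$msg\"}" "$WEBHOOK"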

              • Scrubbles@poptalk.scrubbles.tech · 1 year ago

                crypt is pretty simple, honestly; go through the setup, but it’s essentially just sitting on top of another remote. I usually use the convention {remotename}-crypt. It simply renames and encrypts everything and then passes it on to the remote you already set up. A good way to test it, once you’ve copied some stuff over with crypt set up, is to mount it using rclone mount and try to read some of your files; it should be fine.
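
                For example, once the crypt remote exists, that copy-then-mount test looks roughly like this (remote name and mount point are placeholders):

                #!/bin/bash
                # Copy through the crypt layer, then mount it read-only to confirm files decrypt
                rclone copy /mnt/user/myshare myremote-crypt:/myshare -v
                mkdir -p /mnt/crypt-check
                rclone mount myremote-crypt:/myshare /mnt/crypt-check --read-only &
                sleep 5                           # give the mount a moment to come up
                ls /mnt/crypt-check               # filenames and contents should be readable
                fusermount -u /mnt/crypt-check    # unmount when done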

                Don’t forget to back up your rclone config (securely, and obviously not on your encrypted share)! If those keys are lost, then so is your data.

                • lal309@lemmy.worldOP · 1 year ago

                  I don’t need the whole rclone.conf file, right? Just the authentication details for the underlying remote, plus the password and the salt for crypt. If I have those, shouldn’t I be able to recreate the access, and subsequently the decryption key, on some other machine?
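
                  Roughly, those pieces live in two sections of rclone.conf, something like the hypothetical sketch below; every value is a placeholder. Note that rclone stores the crypt password and salt in obscured form rather than plain text, so keeping a copy of the actual file (rclone config file prints its path) is usually the safer bet.

                  [b2remote]
                  type = b2
                  account = YOUR_B2_KEY_ID
                  key = YOUR_B2_APPLICATION_KEY

                  [b2remote-crypt]
                  type = crypt
                  remote = b2remote:my-bucket/backups
                  password = OBSCURED_PASSWORD
                  password2 = OBSCURED_SALT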

  • Limit@lemm.ee · 1 year ago

    I’m a big fan of Duplicati. You can install it on Linux and Windows (not sure about Mac) and use it to send backups anywhere: to your NAS, to S3, to an SMB share, whatever.

    • lal309@lemmy.worldOP · 1 year ago

      I’ve mostly stayed away from Duplicati because of the “horror stories” around restore operations. Quotes because sometimes people exaggerate, but other times they are legitimate concerns/bad experiences.

    • Research8165@kbin.social · 1 year ago

      I’ve generally had a positive experience with Duplicati for backing up Unraid to Backblaze. I recently had time to test my encrypted backup and had no problem restoring it.

      • Limit@lemm.ee · 1 year ago

        I use it to send backups to Backblaze B2 as well; it works very well for me.

  • lal309@lemmy.worldOP · 1 year ago

    UPDATE: Decided to give rclone a try and automate it all through scripts. So far I have the rclone script checking for errors, logging to a file and sending Discord notifications.

  • camr_on@lemmy.world · 1 year ago

    I’m partial to Duplicacy, not to be confused with Duplicati. It has worked great for me.

  • xyguy@startrek.website · 1 year ago

    I use Syncthing on all my endpoints, Windows and Linux (can’t speak for Mac), to sync to my TrueNAS server. It has a built-in tool to back up to Backblaze on a set schedule.

    I know you can use Syncthing with Unraid in Docker. I have it set up to sync all endpoints to my server, and then the server pushes the latest changes back to all the endpoints. This is overly redundant and you don’t have to do it that way, but all the endpoints and my server would have to die at the same time before I lost any data. It’s sort of a backup scheme in and of itself.

  • bartolomeo@suppo.fi · 1 year ago

    Sorry, this question might just reveal my ignorance, but what is the advantage of using all these programs over just using rsync? Yes, I am old and simple, but I’d love to know.

    • witten@lemmy.world · 1 year ago

      For starters: encryption at rest, block-level deduplication across backups, cloud storage, database support, container support, etc etc.

  • rambos@lemm.ee · 1 year ago

    I use Nextcloud to sync files from my PC/phone to the server and then Kopia CLI for a daily backup to Backblaze (plus a second local backup). I start the Kopia web server for easier restores, but luckily I’ve never had to do a real restore, only test my backups.