this post was submitted on 13 Jun 2023
4 points (100.0% liked)


I see many posts asking about what other lemmings are hosting, but I'm curious about your backups.

I'm using duplicity myself, but I'm considering switching to borgbackup once 2.0 is stable, since I've had some problems with duplicity: the initial sync took incredibly long, and at one point a few directories got corrupted (they could no longer be decrypted by gpg).

I run a daily incremental backup and send the encrypted diffs to a cloud storage box. I also use SyncThing to share some files between my phone and other devices, so those get picked up by duplicity on those devices.
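For context, my daily run boils down to something like this (a rough sketch; the gpg key ID, paths, and storage-box URL are placeholders):

    # Daily incremental backup, gpg-encrypted, pushed to the storage box.
    # --full-if-older-than forces a fresh full backup once a month.
    duplicity incremental \
        --encrypt-key ABCD1234 \
        --full-if-older-than 30D \
        /home/user/data \
        sftp://user@storagebox.example.com/backups/data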

top 28 comments
[–] dead@keylog.zip 2 points 1 year ago

What's my what lmao?

[–] Faceman2K23@discuss.tchncs.de 1 points 1 year ago

I back up everything to my home server... then I run out of money and cross my fingers that it doesn't fail.

Honestly though, my important data is backed up in a couple of places, including a cloud service. 90% of my data is replaceable, so the 10% is easy to keep safe.

I'm paying Google for their enterprise gSuite plan, which is still "unlimited", and using rclone's encrypted drive target to back up everything. I have a couple of scripts that make tarballs of each service's files and do a full backup daily.
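The heart of it is just tar plus an rclone crypt remote, roughly like this ("gdrive-crypt" is a hypothetical crypt remote layered over the Drive remote, and the paths are made up):

    # Tarball one service's files, then copy them through the encrypted remote.
    tar -czf "/tmp/jellyfin-$(date +%F).tar.gz" /srv/jellyfin
    rclone copy "/tmp/jellyfin-$(date +%F).tar.gz" gdrive-crypt:backups/jellyfin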

It's probably excessive, but nobody was ever mad about the fact they had too many backups if they needed them, so whatever.

[–] linearchaos@lemmy.world 1 points 1 year ago

Irreplaceable media: NAS -> Backblaze, and NAS -> JBOD via duplicacy for versioning.

Large ISOs that can be downloaded again: NAS -> JBOD, and/or NAS -> offline disks.

Stuff that's critical leaves the house; stuff that would just cost me a lot of personal time to rebuild gets a copy or two.
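For anyone who hasn't used duplicacy, the versioning side is roughly this (the snapshot id and bucket name are placeholders):

    # One-time: tie this directory to a B2 bucket under snapshot id "media".
    cd /mnt/nas/irreplaceable
    duplicacy init media b2://my-backup-bucket
    # Every later run stores a new deduplicated revision you can roll back to.
    duplicacy backup -stats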

[–] Amius@yiffit.net 1 points 1 year ago

Holy crap. Duplicity is what I've been missing my entire life. Thank you for this.

[–] somedaysoon@lemmy.world 1 points 1 year ago
[–] tj@fedia.io 1 points 1 year ago* (last edited 1 year ago)

I have a central NAS server that hosts all my personal files and shares them (via smb, ssh, syncthing and jellyfin). It also pulls backups from all my local servers and cloud services (google drive, onedrive, dropbox, evernote, mail, calendar and contacts, etc.). It runs zfs raid 1 and snapshots every 15 minutes. Every night it backs up important files to Backblaze in a US region and Azure in an EU region (using restic).
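The snapshot half is just cron plus zfs, something along these lines (the dataset name is a placeholder):

    # Crontab entry: snapshot the dataset every 15 minutes.
    # (% has to be escaped inside crontab entries.)
    */15 * * * * /sbin/zfs snapshot tank/personal@auto-$(date +\%Y\%m\%d-\%H\%M)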

I have a bootstrap procedure in place to do a "clean room recovery", assuming I've lost access to all my devices: I only need to remember a tediously long encryption password for a small package containing everything needed to recover from scratch. It is tested every year during the Christmas holidays, including comparing every backed-up and restored file against the original via md5/sha256 comparison.
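Roughly, the nightly runs and the yearly check look like this (bucket/container names are placeholders, and credentials are assumed to be in restic's usual environment variables):

    # Nightly: push the important datasets to both regions.
    restic -r b2:nas-backup-us:/ backup /tank/personal
    restic -r azure:nas-backup-eu:/ backup /tank/personal

    # Yearly: restore to a scratch directory, then compare checksums.
    restic -r b2:nas-backup-us:/ restore latest --target /tmp/restore-test
    diff <(cd /tank/personal && find . -type f -exec sha256sum {} + | sort) \
         <(cd /tmp/restore-test/tank/personal && find . -type f -exec sha256sum {} + | sort)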

[–] ptman@sopuli.xyz 1 points 1 year ago

I'm moving from rsync+duplicity+borg towards bupstash

[–] Oli@fedia.io 1 points 1 year ago

In the process of moving stuff over to Backblaze. Home PCs, a few client PCs, and client websites are all pointing at it now; happy with the service and price. Two unraid instances push the most important data to an Azure storage account, but I imagine I'll move that to BB soon as well.
Docker backups are similar to the post above: tarball the whole thing weekly as a get-out-of-jail card. This is not ideal, but it works for now until I can give it some more attention.
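The weekly tarball is little more than a cron line (paths hypothetical):

    # Sunday 03:00: tarball the whole stack directory as a get-out-of-jail card.
    # (% has to be escaped inside crontab entries.)
    0 3 * * 0 tar -czf /backups/stacks-$(date +\%F).tar.gz /opt/stacks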

*I have no link to BB other than being a customer who wanted to reduce reliance on scripts and move stuff out of Azure for cost reasons.

[–] kabouterke@lemmy.world 1 points 1 year ago

In short: crontab, rsync, a local and a remote Raspberry Pi, and cryptfs on USB sticks.
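The core of a setup like that is a single crontab line (host and paths made up):

    # 02:00 nightly: mirror home onto the encrypted USB stick on the remote Pi.
    0 2 * * * rsync -a --delete /home/user/ pi@remote-pi:/mnt/usb-crypt/home-backup/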

[–] hxhz@lemmy.world 1 points 1 year ago

I use a BackupPC instance hosted on an off-site server with a 1 TB drive. It connects through ssh to all my VMs and backs up /home and any other folders I may need. It handles full and incremental backups, deduplication, and compression.
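For anyone curious, the per-host side of a BackupPC setup like that is a small config file, roughly (hostname and share list are placeholders):

    # pc/myvm.pl -- per-host config: pull /home (and friends) over ssh+rsync.
    $Conf{XferMethod}     = 'rsync';
    $Conf{RsyncShareName} = ['/home', '/etc'];
    $Conf{RsyncSshArgs}   = ['-e', '$sshPath -l backuppc'];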

[–] Sekoia@lemmy.blahaj.zone 1 points 1 year ago

Backups? What backups?

I know it's bad, but I can't be bothered.

Backing up to backblaze with duplicacy

[–] banana1@lemmy.ca 0 points 1 year ago (1 children)

Personally I do:

  • Daily snapshots of my data + daily restic backup on-site on a different machine
  • Daily VM/container snapshots locally and on a different machine, keeping at least 2 monthly, 2 weekly, and 2 daily backups
  • Weekly incremental data backup to an immutable B2 bucket, with a new bucket every month and 6-month immutability (so data can't be changed/erased for 6 months); see the sketch after this list
  • Weekly incremental data backup to another off-site machine
  • Monthly (though I should start doing it weekly) backup of important data (mainly documents and photos) to removable media that I keep offline in a fire-proof safe
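A minimal sketch of the immutable-bucket step, assuming restic does the weekly run too (the bucket naming scheme is illustrative, and each bucket is assumed to be created with object lock/retention already enabled):

    # Weekly: incremental backup into this month's locked bucket.
    BUCKET="backup-$(date +%Y-%m)"
    restic -r "b2:${BUCKET}:/" init 2>/dev/null || true   # first run of the month
    restic -r "b2:${BUCKET}:/" backup /srv/data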

Maybe it's overkill, maybe it's not enough; I'll find out when something fails and I'm screwed, haha.

As a note, everybody should test/check their backups frequently. I once had an issue after changing an IP address and figured out six months later that half my backups were not working...

[–] lhamil64@beehaw.org 0 points 1 year ago (1 children)

How do you approach testing your backups? It seems like you shouldn't just restore them into the live applications, because if the restore fails you're screwed. But it also seems like a huge pain to create duplicate instances of every application just to test a backup.

[–] banana1@lemmy.ca 1 points 1 year ago

I do restore my VMs to duplicate VMs to test from time to time (it's pretty easy with Proxmox). For data backups I use Restic, which encrypts the data before uploading it, so you should restore a backup to a different folder to verify the data's integrity and make sure you haven't forgotten your keys, haha.

You don't have to do it every week or month, but it's worth doing a few times a year, or whenever you change something!
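A test restore like that can be as small as this (repo name is a placeholder):

    # Sanity-check the repo, re-reading a sample of the stored data.
    restic -r b2:my-bucket:/ check --read-data-subset=10%
    # Restore to a scratch directory and compare against the live data.
    restic -r b2:my-bucket:/ restore latest --target /tmp/restore-test
    diff -r /srv/data /tmp/restore-test/srv/data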

[–] gerowen@lemmy.world 0 points 1 year ago

I have an external hard drive that I keep in the car. I bring it in once a month and sync it with the server. The data partition is encrypted so that even if it were to get stolen, the data itself is safe.
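For anyone setting up something similar, the monthly sync is only a handful of commands, assuming LUKS for the encryption (device and paths are placeholders):

    cryptsetup open /dev/sdb1 offsite      # unlock the encrypted partition
    mount /dev/mapper/offsite /mnt/offsite
    rsync -a --delete /srv/data/ /mnt/offsite/data/
    umount /mnt/offsite
    cryptsetup close offsite               # lock it again before it goes back to the car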

[–] Showroom7561@lemmy.ca 0 points 1 year ago (1 children)

All devices back up to my NAS, either in realtime or at short intervals throughout the day. I use recycle bins for easy restores of accidentally deleted files.

My NAS is set up on a RAID for drive redundancy (Synology RAID) and does regular backups to the cloud for active files.

Once a day I do a Hyper Backup to an external HDD.

Once a month I back up to an external drive that lives offsite.

Backups to these external HDDs have versioning, so I can restore files from multiple months ago, if needed.

The biggest challenge is that as my NAS grows, it costs significantly more to expand my backup space. Cloud storage and new external drives aren't cheap. If I had an easy way to keep a separate NAS offsite, that would considerably reduce ongoing costs.

[–] homelabber@lemmy.one 0 points 1 year ago* (last edited 1 year ago) (1 children)

Depending on how much storage you need (>30 TB?), it may be cheaper to use a colocation service for a server as an offsite backup instead of cloud storage. It's not as safe, but it can be quite a bit cheaper, especially if for some reason you're forced to rapidly download a lot of your data from the cloud backup (Backblaze B2 charges $0.01/GB downloaded).

[–] Showroom7561@lemmy.ca 0 points 1 year ago (1 children)

Do you have an example or website I could look at for this 'colocation service'?

Currently using idrive as the cloud provider, which is free until the end of the year, but I'm not locked into their service. Cloud backups really only see the more active files (<7 TB), and the unchanging stuff like my movie and music catalogues seems reasonably safe on offsite HDD backups, so I don't have to pay just to keep those somewhere else.

[–] homelabber@lemmy.one 0 points 1 year ago (1 children)

First, I'd like to apologize: I originally wrote less than 30 TB instead of more than 30 TB; I've changed that in the post.

A colocation facility is a data center where you pay a monthly price and they'll house your server. Electricity and internet bandwidth are usually included, albeit with certain limits; if you need more, you can always pay extra.

Here's an example. It's usually around $99/99€ per 1U server. If you live in or near a big city, there's probably at least one data center nearby that offers colocation services.

But as I said, it's only worth it if you need a lot of storage or if you move files around a lot, because bandwidth charges when using object storage tend to be quite high.

For <7 TB it isn't worth it, but maybe in the future.

[–] Showroom7561@lemmy.ca 1 points 1 year ago

Thanks for the info. Something to consider as my needs grow 👍

[–] TheCakeWasNoLie@lemmy.world 0 points 1 year ago (1 children)

An rsync script that does daily deltas using hardlinks, found on the Arch wiki. Works like a charm.
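The trick is rsync's --link-dest: files unchanged since the previous snapshot become hardlinks instead of copies, so every daily tree is fully browseable but only the deltas consume new space. A minimal sketch (paths are placeholders):

    TODAY=$(date +%F)
    # Unchanged files are hardlinked against yesterday's snapshot.
    rsync -a --delete --link-dest=/backups/latest /home/user/ "/backups/${TODAY}/"
    # Repoint "latest" at today's snapshot for the next run.
    ln -sfn "/backups/${TODAY}" /backups/latest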

[–] ptman@sopuli.xyz 1 points 1 year ago (1 children)
[–] TheCakeWasNoLie@lemmy.world 1 points 1 year ago

No. Rsync works fine, and it is easily testable (untested backups are no backups).
