this post was submitted on 31 Jul 2024 to Selfhosted

I have a Proxmox+Debian+Docker server and I'm looking to set up my backups so that they get backed up (DUH) on my Linux PC whenever it comes online on the local network.

I'm not sure what's best: backing up locally and having something else handle the copying; how to make those backups run if they haven't run in a while, regardless of the PC's availability; or whether the PC should run the logic or the server should keep control of it.

Mostly I don't want to waste space on my server because it's limited...

Currently I don't know the what and I don't know the how; any input is appreciated.

top 15 comments
[–] schizo@forum.uncomfortable.business 12 points 3 months ago (2 children)

I see syncthing being recommended, and like, it's fine.

But keep in mind it's NOT a backup tool, it's a syncing tool.

If something happens to the data on your client, for example, it will happily sync and overwrite your Linux box's copy with junk, or if you delete something, it'll vanish in both places.

It's not a replacement for recoverable-in-a-disaster backups, and you should make sure you've got a copy somewhere that isn't subject to the client nuking it if something goes wrong.

[–] daddy32@lemmy.world 3 points 3 months ago

This is a very important distinction to make. Sync is not a backup.

However, you can get 90% of the way there with Syncthing if you enable file versioning, or at least the trash can, for the files.
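For reference, a minimal sketch of enabling simple versioning through Syncthing's REST API; the folder ID, API key, and keep count here are hypothetical, and the same setting is available per-folder in the web GUI:

```bash
# Hypothetical folder ID and API key; keeps the last 5 versions
# of files that get changed or deleted by a remote device.
curl -X PATCH -H "X-API-Key: $APIKEY" \
  http://localhost:8384/rest/config/folders/my-folder-id \
  -d '{"versioning": {"type": "simple", "params": {"keep": "5"}}}'
```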

[–] dwindling7373@feddit.it 1 points 3 months ago* (last edited 3 months ago) (1 children)

Thanks for the heads up, yeah, I'm well aware of that. I use it to, well... sync my phone pictures with my PC.

[–] peregus@lemmy.world 1 points 3 months ago

You could use Syncthing and then run a backup of the synced folder on the server.

[–] conrad82@lemmy.world 8 points 3 months ago

I use Syncthing to copy important files between PC, phone, and Proxmox server. Syncthing can be set up with file versioning so it keeps old versions of files.

Only the Proxmox server is properly backed up, though: to a Proxmox Backup Server running in a VM on said Proxmox server. The encrypted backup files are copied to Backblaze using rclone.

Not sure if this is what you are looking for, but it works for me.

TL;DR: Syncthing for copies between local machines, and Proxmox Backup Server plus Backblaze for proper backups.
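The offsite leg can be a single scheduled rclone command; the remote name, bucket, and datastore path below are hypothetical:

```bash
# Hypothetical B2 remote and PBS datastore path; mirrors the encrypted
# chunk store offsite (use "rclone copy" if you never want deletions propagated).
rclone sync /mnt/datastore/pbs b2:my-bucket/pbs-backups --transfers 8
```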

[–] Illecors@lemmy.cafe 4 points 3 months ago

I'm not the best person to query about backups, but in your situation I would do the following, assuming both server and desktop run on BTRFS:

Have a script on the desktop that starts btrfs receive and then notifies the server that it should start btrfs send.

You can also do rsync if BTRFS is not a thing you use, but it would either be expensive storage-wise, or you would only ever have one backup: the latest.
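A rough sketch of that flow, run from the desktop; the hostnames, subvolume paths, and snapshot names are hypothetical, and a common parent snapshot must already exist on both sides:

```bash
#!/bin/sh
# Pull an incremental btrfs snapshot from the server to the desktop.
SERVER=server.lan
NEW=@data-$(date +%F)

# Create a read-only snapshot on the server, then stream only the delta
# against the previous snapshot (@data-prev) into the local backup mount.
ssh root@"$SERVER" "btrfs subvolume snapshot -r /srv/data /srv/$NEW &&
                    btrfs send -p /srv/@data-prev /srv/$NEW" |
  sudo btrfs receive /mnt/backups
```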

[–] anzo@programming.dev 3 points 3 months ago

As to how, I'd probably use zfs send | receive, any built-in functionality on a CoW filesystem, rsnapshot, rclone, or just Syncthing. As to when, I'd probably hack something together with systemd triggers (e.g. on network connection, send all remaining incremental snapshots). But this would only be needed in some cases (e.g. when not using Syncthing ;p)
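For the ZFS variant, a comparable sketch; the pool, dataset, and snapshot names are hypothetical, and a common @prev snapshot is assumed on both sides:

```bash
#!/bin/sh
# Take a snapshot on the server and receive the increment on the PC.
SNAP="tank/data@$(date +%F)"
ssh root@server.lan "zfs snapshot $SNAP && zfs send -i @prev $SNAP" |
  sudo zfs receive backup/data   # add -F if the target dataset has drifted
```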

[–] lupec@lemm.ee 3 points 3 months ago

Since space is a major concern, maybe have a look at borg and possibly something like borgmatic on top for easier configuration. Borg does deduplicated backups, so you could do even hourly ones if you wanted without too much extra space depending on how many you want to keep. You'd need to run a borg server wherever you want to store your backups so it's not a simple rsync over ssh situation but that's the price you pay for the extra niceties.
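A minimal Borg sketch under those assumptions; the hostname, paths, and retention counts are hypothetical, and Borg needs to be installed on the PC side too:

```bash
# The repository lives on the PC and is reached over SSH.
export BORG_REPO=ssh://me@desktop.lan/~/backups/server

borg init --encryption=repokey                        # run once to create the repo
borg create --stats ::'{hostname}-{now}' /srv/docker /srv/documents
borg prune --keep-daily=7 --keep-weekly=4 --keep-monthly=6
```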

[–] just_another_person@lemmy.world 2 points 3 months ago (1 children)

You probably want to go into a bit more detail on exactly what you want to back up. Are you talking about the entire system, flat files, databases...?

[–] dwindling7373@feddit.it 1 points 3 months ago (1 children)

Docker configs, sensitive documents, pictures, a limited amount of video files...

[–] just_another_person@lemmy.world 3 points 3 months ago (1 children)

Cron + rsync is always a bulletproof solution: a simple bash script that runs every X minutes to sync to a network target. It wouldn't need to run only when the machine comes online, as you mentioned.
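A sketch of that script, pushed from the server; the hostname, paths, and schedule are hypothetical:

```bash
#!/bin/sh
# Push backups to the PC only when it answers on the LAN.
# Schedule from the server's crontab, e.g.:
#   */15 * * * * /usr/local/bin/push-backup.sh
ping -c 1 -W 2 desktop.lan >/dev/null 2>&1 || exit 0   # PC offline; try again later
rsync -a --delete /srv/backups/ me@desktop.lan:backups/server/
```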

[–] dwindling7373@feddit.it 2 points 3 months ago (1 children)

I will probably start with this approach and see where it leads me, thanks!

[–] tvcvt@lemmy.ml 1 points 3 months ago

Since you're interested in this kind of DIY approach, I'd seriously consider thinking the whole process through and writing a simple script for this that runs from your desktop. That will make it trivial to do an automatic backup whenever you're active on the network.

Instead of cron, look into systemd timers and you can fire off your script after, say, one minute of being on your desktop, using a monotonic timer like OnUnitActiveSec=60.

Thinking through the script in pseudo code, it could look something like:

rsync -avzh $server_source $desktop_destination || curl -d "Backup failed" ntfy.sh/mytopic

This would pull the backup from your server to your desktop and, if the backup failed, use a service such as ntfy.sh to notify you of the problem.

I think that would pretty much take care of all of your requirements and if you ever decided to switch systems (like using zfs send/recv instead of rsync), it would be a matter of just altering that one script.
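A minimal sketch of the timer half; the unit names, script path, and exact trigger values are hypothetical:

```bash
# Run as your desktop user; assumes the backup script lives at ~/bin/pull-backup.sh.
mkdir -p ~/.config/systemd/user

cat > ~/.config/systemd/user/pull-backup.service <<'EOF'
[Unit]
Description=Pull backups from the server

[Service]
Type=oneshot
ExecStart=%h/bin/pull-backup.sh
EOF

cat > ~/.config/systemd/user/pull-backup.timer <<'EOF'
[Unit]
Description=Pull backups shortly after login, then hourly

[Timer]
OnActiveSec=60
OnUnitActiveSec=1h

[Install]
WantedBy=timers.target
EOF

systemctl --user daemon-reload
systemctl --user enable --now pull-backup.timer
```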

[–] jet@hackertalks.com 2 points 3 months ago* (last edited 3 months ago)

Run your backup process every few minutes: if both machines are online, it sends the latest snapshot; if not, it exits. You could reduce your snapshots to once a day, so it only does the data transfer once per day and any extra backup runs just succeed.

The downside of this approach is that you will need to set up alerting if a snapshot doesn't get backed up within a window, rather than alerting on a single backup process failing (i.e. if the server doesn't have a successful backup within a week, create an alert).

You could also do the same thing with rclone VFS mounts with full caching, so you could write locally even when the network is offline, and it would transfer once you're back online (but you need to set up your server-side alerting again).
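A sketch of the windowed alerting; the stamp-file path, the seven-day window, and the ntfy topic are hypothetical, and the backup job is assumed to touch the stamp file on success:

```bash
#!/bin/sh
# Run daily from cron; alert if no successful backup within the last 7 days.
STAMP=/var/lib/backup/last-success
if [ -z "$(find "$STAMP" -mtime -7 2>/dev/null)" ]; then
  curl -d "No successful backup in the last week" ntfy.sh/mytopic
fi
```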

[–] Shimitar@feddit.it 2 points 3 months ago

Syncthing is the way... Been using it forever. Easy to set up, works flawlessly, doesn't get in the way.

I use it for server, PC, laptop and android devices.

... Then you want to back up (Borg, restic...) your synced files, of course ...