this post was submitted on 17 Feb 2024
95 points (98.0% liked)

Selfhosted


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


founded 1 year ago

What storage software could I run to have an archive of my personal files (a couple TB of photos) that doesn't require I keep a full local copy of all the data? I like the idea of a simple and focused tool like Syncthing, but they seem to be angling towards replication.

Is the simple choice to run some S3-like backend and use CLI or other client to append and browse files? I'd love something with fault tolerance that someone can gradually add disks to. If ceph were either less complicated or used less resources I'd want to do that.

[–] lemmyvore@feddit.nl 24 points 9 months ago

Borg Backup. It can work locally or over network. Takes snapshots of the files you give it. Performs deduplication, compression and optionally encryption. You can check the integrity of the backups and repair them. There's a very simple to use GUI for it called Pika Backup to get you started.
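For reference, that workflow is just a few commands. This is a sketch only; the repository path, compression choice, and archive name below are placeholders, not from the comment:

```shell
# One-time: create an encrypted, deduplicating repository
# (a local path, or user@host:path for a remote over SSH)
borg init --encryption=repokey /mnt/backup/borg-repo

# Take a snapshot of the photo archive; dedup and compression are automatic
borg create --compression zstd /mnt/backup/borg-repo::photos-{now} ~/Photos

# Verify the integrity of the repository and its archives
borg check /mnt/backup/borg-repo
```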

[–] Everythingispenguins@lemmy.world 23 points 9 months ago (1 children)

Punch cards. Is it the best? No, but no one is going to bother to steal my data. Encryption through inconvenience.

[–] jkrtn@lemmy.ml 19 points 9 months ago

Do a riffle shuffle to make them even more secure!

[–] skilltheamps@feddit.de 15 points 9 months ago (1 children)

that doesn't require I keep a full local copy of all the data

If you don't do that, the place that you call "backup" is the only place where the data is stored, and that is not a backup. A backup is an additional copy, for the case when your primary storage gets destroyed.

[–] jkrtn@lemmy.ml 1 points 9 months ago (1 children)

"Local" as in the machine I am using to work on, which has a 256 GB SSD. Not as in "on-site" and "off-site."

[–] computergeek125@lemmy.world 8 points 9 months ago* (last edited 9 months ago) (1 children)

In the IT world, we just call that a server. The usual golden rule for backups is 3-2-1:

  • 3 copies of the data total, of which
  • 2 are backups (not the primary access), and
  • 1 of the backups is off-site.

So, if the data is only server side, it's just data. If the data is only client side, it's just data. But if the data is fully replicated on both sides, now you have a backup.

There's a related adage regarding backups: "if there's two copies of the data, you effectively have one. If there's only one copy of the data, you can never guarantee it's there". Basically, it means you should always assume one copy somewhere will fail and you will be left with n-1 copies. In your example, if your server failed or got ransomwared, you wouldn't have a complete dataset since the local computer doesn't have a full replica.

I recently had a backup drive fail on me, and all I had to do was buy a new one. No data loss; I just regenerated the backup as soon as the drive was spun up. I've also had to restore entire servers that have failed. Minimal data loss since the last backup, but nothing I couldn't rebuild.

Edit: I'm not saying what you're asking for is wrong or bad, I'm just saying "backup" isn't the right word to ask about. It'll muddy some of the answers as to what you're really looking for.

[–] jkrtn@lemmy.ml 2 points 9 months ago (1 children)

Yes, I do see that. I'm definitely getting answers to a question I didn't intend. I was hoping for something more like rsync, but which also provides browsing and incremental backups to an offsite location. I don't know how to phrase that, and perhaps for what I want it makes more sense to use rsync/rclone to copy files around and something else to view them.

[–] solrize@lemmy.world 10 points 9 months ago (4 children)

I use Borg Backup to a Hetzner storage box but doing the same thing to a disk array would work fine. How much data are you talking about? What is the usage picture? Backup and archiving are really not the same thing.

[–] GBU_28@lemm.ee 9 points 9 months ago (1 children)
[–] anzo@programming.dev 5 points 9 months ago

Rclone.org is poetry then ;)

[–] deegeese@sopuli.xyz 8 points 9 months ago (1 children)

Are we talking personal offsite backup, or a commercial cloud service?

For cloud backups I like BackBlaze but I’ve never tried to use it as a general cloud storage drive.

[–] jkrtn@lemmy.ml 2 points 9 months ago (1 children)

This would be self-hosted and local, one of the locations in a 3-2-1 strategy. BackBlaze would work for an offsite but I already have that portion covered.

[–] deegeese@sopuli.xyz 2 points 9 months ago (8 children)

that doesn't require I keep a full local copy of all the data

So you want a local self-hosted backup, but also not a full copy? So like backing up only recently changed files?

[–] const_void@lemmy.ml 7 points 9 months ago

rsync and another hard drive

[–] DeltaTangoLima@reddrefuge.com 6 points 9 months ago (2 children)

I use rclone, with encryption, to S3. I have close to 3TB of personal data backed up to S3 this way - photos, videos, paperless-ngx (files and database).

Only readable if you have the passwords configured on my singular backup host (a RasPi), or stored in Bitwarden.
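As a rough sketch of that layered setup (the remote names and bucket below are made up; the crypt remote wraps the S3 remote so objects land encrypted, and the crypt passwords are set during `rclone config`):

```shell
# One-time: define an S3 remote, then a crypt remote layered on top of it
rclone config create s3remote s3 provider AWS
rclone config create securebackup crypt remote s3remote:my-backup-bucket

# Sync local photos through the crypt layer;
# both filenames and contents are encrypted before upload
rclone sync ~/Photos securebackup:photos --progress
```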

[–] ironsoap@lemmy.one 2 points 9 months ago (1 children)

This, alongside Backblaze, is what I would suggest, assuming you are thinking online. Cheap and reliable, and relatively easy to automate via a cron job. https://help.backblaze.com/hc/en-us/articles/1260804565710-Quickstart-Guide-for-Rclone-and-B2-Cloud-Storage

[–] DeltaTangoLima@reddrefuge.com 3 points 9 months ago

Backblaze don't have a POP in my country, unfortunately.

[–] nullPointer@programming.dev 1 points 9 months ago* (last edited 9 months ago)

Tarsnap makes use of S3. It does a decent deduplication job as well.

[–] YurkshireLad@lemmy.ca 5 points 9 months ago (1 children)
[–] jkrtn@lemmy.ml 1 points 9 months ago (1 children)

That's top of my list for moving the files if I do an S3 or WebDAV backend. I'm overthinking this, aren't I? Just find a WebDAV server, set it up, use rclone to append files and pretty much everything else will be able to browse.

[–] YurkshireLad@lemmy.ca 2 points 9 months ago

Haha it's easy to overthink things sometimes. I'm guilty of that. I'm using SFTPGo at home to serve files from a small server.

[–] atzanteol@sh.itjust.works 4 points 9 months ago* (last edited 9 months ago)

Sounds like something like "git annex" is what you're looking for?

I use this to manage all my photos. It lets you add binaries and synchronize them to a backend server (can be local, can be S3, Backblaze, etc.).

You can then "drop" files, and it ensures a remote copy exists first. After you drop a file you still see a symlink of it locally (it's broken), so that you know it exists.

My workflow is to add my files, sync them to both a local server and B2, then drop and fetch folders as I need (need disk space? "git annex drop 2022*"; want to edit some photos? "git annex get 2022_10_01").
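Spelled out, that workflow looks roughly like this (the repo layout and the "homeserver" remote name are illustrative, and the remote has to be configured first):

```shell
# One-time: turn a git repo into an annex
git init photos && cd photos
git annex init "laptop"

# Add files: contents go into the annex, symlinks get committed to git
git annex add 2022_10_01/
git commit -m "add photos"

# Push content to a remote, then free local space;
# drop refuses to delete unless enough remote copies exist
git annex copy --to=homeserver 2022_10_01/
git annex drop 2022_10_01/

# Later, pull the content back on demand
git annex get 2022_10_01/
```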

[–] JakenVeina@lemm.ee 4 points 9 months ago

rsync, for sure. That's what I used when I had to migrate a 10TB datastore to a new machine.

[–] francisfordpoopola@lemmy.world 4 points 9 months ago

Where will the target be? Online or local? Rsync is really easy to use and the target files are browse-able. I could be too dense but I find online buckets aren't easily browse-able. Even a homemade NAS might be a good choice and it's easily scalable.

[–] zeluko@kbin.social 4 points 9 months ago* (last edited 9 months ago) (1 children)

So I understood you just want some local storage system with some fault tolerance.
ZFS will do that. Nothing fancy: just volumes, as either block devices or ZFS filesystems.
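As a sketch (device names are hypothetical), a redundant pool plus a dataset for the archive might look like:

```shell
# A raidz1 pool survives one disk failure;
# capacity can later be grown by adding another vdev with `zpool add`
zpool create tank raidz1 /dev/sda /dev/sdb /dev/sdc

# A dataset for the archive, with transparent compression
zfs create -o compression=lz4 tank/photos

# Cheap point-in-time snapshots for rollback
zfs snapshot tank/photos@2024-02-17
```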

If you want something more fancy, maybe even distributed, check out storage cluster systems with erasure coding, less storage wasted than with pure replication, though comes at reconstruction cost if something goes wrong.

MinIO comes to mind, though I never used it... my requirements seem to be so rare that these tools only get close :/
AFAIK you can add more disks and nodes more or less dynamically with it.

[–] jkrtn@lemmy.ml 1 points 9 months ago

Yeah it's hard to find something that perfectly fits just what you want. I think it's better if I do something simple like ZFS and maybe some kind of file server on top.

[–] hperrin@lemmy.world 4 points 9 months ago (2 children)

All of my machines back up to my home server’s RAID over WebDAV with Nephele.

Then every few days I’ll manually sync them to a server at my parents’ house with a single huge HDD using rsync. I do this manually so that if anything happens to my home server (like ransomware) it doesn’t mirror destroyed data.

Since the Nephele share is just WebDAV, I can mount it locally and move things into it that I don’t want local anymore.
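For anyone unfamiliar, mounting a WebDAV share locally can be done with davfs2; the URL and mountpoint below are placeholders:

```shell
# Requires the davfs2 package; credentials can go in /etc/davfs2/secrets
sudo mount -t davfs https://nas.example.local/dav /mnt/nephele

# Move files into the share to free up local disk
mv ~/Videos/old-projects /mnt/nephele/archive/
```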

I created Nephele, and I just finished writing an encryption plugin. I wrote it because I’m also going to write an S3 adapter. That way, you can store things in S3, but they’ll be encrypted, so Amazon can’t see them.

[–] blackbirdbiryani@lemmy.world 2 points 9 months ago (2 children)

Wouldn't syncing automatically every few days give you the same protection though?

[–] jkrtn@lemmy.ml 2 points 9 months ago

This is really cool. I ended up trying something similar: serving from a ZFS pool with SeaweedFS. TBD if that's going to work for me long term.

I would definitely be able to manually sync the SeaweedFS files with rsync to another location but from what I see it requires me to use their software to make sense of any structure. I might be able to mount it and sync that way, hopefully performance for that is not too bad.

Syncing like that and having more control over where the files are placed on the RAID is very cool.

[–] thagoat@lemmy.sdf.org 3 points 9 months ago* (last edited 9 months ago)

Hetzner storage box, rsync and a bash script

[–] boblemmy@lemmy.world 3 points 9 months ago
[–] computergeek125@lemmy.world 3 points 9 months ago* (last edited 9 months ago)

What platform?

Another user said it: what you're asking for isn't a backup, it's just data transfer.

It sounds like you're looking for a storage backend that hosts all your data and can download data to the client side on the fly.

If your use case is Windows, Nextcloud Desktop may be what you're looking for. I have a similar setup with my game clips folder. It detects changes and auto-uploads them, while deleting less recently used data that's already safely server side. This feature might exist on Mac but I haven't tested it.

Backup wise, I capture an rsync of the nextcloud database and filesystem server-side and store it on a different chassis. That then gets backed up again to a USB drive I can grab and run.

Nextcloud also supports external storage, which the server directly connects to: https://docs.nextcloud.com/server/latest/admin_manual/configuration_files/external_storage_configuration_gui.html

[–] Alpha71@lemmy.world 3 points 9 months ago
[–] Dremor@lemmy.world 2 points 9 months ago

You could run a WebDAV server, like Nextcloud.

On Windows it supports thin sync (meaning it keeps a reference to the file instead of the whole file); on Linux not yet, as that feature is still in alpha (but you can just connect it as a remote disk and be done with it. That's how I do it with mine).

If you don't want the whole Nextcloud, there are standalone cli WebDAV servers.

[–] BearOfaTime@lemm.ee 2 points 9 months ago

Syncthing can do send only. It's pretty configurable.

But I'd probably use a cloud storage like storj.io, and tools like duplicati.

[–] spez_@lemmy.world 1 points 8 months ago

Restic with Backrest: https://forum.restic.net/t/backrest-a-cross-platform-backup-orchestrator-and-webui-for-restic/7069

Although I use resticprofile at the moment, with rclone to sync to Backblaze B2.
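restic can talk to B2 either natively or through an rclone remote; a minimal sketch (the remote and bucket names here are invented):

```shell
# Initialize a repository on a B2 bucket via an existing rclone remote
restic -r rclone:b2remote:my-backups init

# Back up, then apply a retention policy and reclaim space
restic -r rclone:b2remote:my-backups backup ~/Photos
restic -r rclone:b2remote:my-backups forget --keep-daily 7 --keep-monthly 12 --prune
```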

[–] Nomecks@lemmy.ca 1 points 9 months ago

Save your files to a local S3 object storage mount, enable versioning for immutability, and use erasure coding for fault tolerance. You can use Lustre or some other S3 software for the mount. S3 is great for single-user file access. You can also replicate to any cloud-based S3 for offsite.
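For the versioning part, with the AWS CLI (which also works against most S3-compatible endpoints via `--endpoint-url`; the bucket name is a placeholder):

```shell
# Keep every object version so overwrites and deletes are recoverable
aws s3api put-bucket-versioning \
  --bucket my-photo-archive \
  --versioning-configuration Status=Enabled
```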

[–] knobbysideup@sh.itjust.works 1 points 9 months ago (1 children)

Borg. With rsync.net if you want to keep an off-site.

[–] iturnedintoanewt@lemm.ee 2 points 9 months ago (3 children)

Is there a decent UI for borg, or is it all CLI?

[–] PrecisePangolin@lemmy.ml 2 points 9 months ago

Pika backup seems to be mentioned a lot.
