this post was submitted on 19 Mar 2025
52 points (90.6% liked)

Linux


I recently implemented a backup workflow for myself. I rely heavily on restic for desktop backups and for a full system backup of my local server. It works amazingly well: I always have a versioned backup without a lot of redundant data, and it is fast, encrypted, and compressed.

But I wondered: how do you all do your backups? What software do you use? How often do you run them, and what workflow do you follow?
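For reference, the restic workflow described above can be scripted with a small wrapper. This sketch only builds the command line; the repository path, backup paths, and exclude patterns are placeholders, not the poster's actual setup.

```python
import shlex

def restic_backup_command(repo, paths, excludes):
    """Build the argv for one versioned, deduplicated restic snapshot."""
    cmd = ["restic", "-r", repo, "backup", *paths]
    for pattern in excludes:
        cmd += ["--exclude", pattern]
    return cmd

# Placeholder values -- adjust to your own layout.
cmd = restic_backup_command("/mnt/backup/restic-repo", ["/home", "/etc"], [".cache"])
print(shlex.join(cmd))
# restic -r /mnt/backup/restic-repo backup /home /etc --exclude .cache
```

Actually running it amounts to `subprocess.run(cmd, check=True)` with `RESTIC_PASSWORD` (or `RESTIC_PASSWORD_FILE`) set in the environment; `restic forget --prune` then handles retention of old snapshots.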

(page 2) 14 comments
[–] haque@lemm.ee 1 points 3 months ago

I use Duplicacy to back up to my TrueNAS server. Crucial data like documents is backed up a second time to my GDrive, also using Duplicacy. Sadly it's a paid solution, but it works great for me.
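For the record, the CLI side of a Duplicacy workflow like this looks roughly as below. `init` and `backup` are real subcommands, but the snapshot ID and storage URL here are invented placeholders, not the commenter's configuration.

```python
def duplicacy_commands(snapshot_id, storage_url):
    """Return the one-time init and the recurring backup invocations.

    `duplicacy init` ties the current directory to a storage backend;
    `duplicacy backup` then uploads an incremental snapshot.
    """
    return (
        ["duplicacy", "init", snapshot_id, storage_url],
        ["duplicacy", "backup"],
    )

init_cmd, backup_cmd = duplicacy_commands("documents", "sftp://user@truenas//backups")
```

Running `backup` on a schedule (cron, systemd timer, or the GUI) gives the versioned second copy described above.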

[–] tankplanker@lemmy.world 1 points 3 months ago

Borg daily to the local drive, then copied across to a USB drive, then weekly to cloud storage. The script is triggered by daily runs of topgrade before I do any updates.
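A rough sketch of that layered schedule, expressed the same way as the other scripts in this thread. The repository paths, archive naming, and compression choice are illustrative guesses, not the commenter's actual script.

```python
from datetime import date

def borg_daily_command(repo, paths):
    """Daily: create a dated borg archive in a local repository."""
    archive = f"{repo}::daily-{date.today().isoformat()}"
    return ["borg", "create", "--stats", "--compression", "zstd", archive, *paths]

def usb_copy_command(repo, usb_mount):
    """Then mirror the whole repository onto a USB drive."""
    return ["rsync", "-a", "--delete", f"{repo}/", f"{usb_mount}/borg-repo/"]

daily = borg_daily_command("/backups/borg", ["/home"])
usb = usb_copy_command("/backups/borg", "/media/usb")
```

The weekly cloud leg would be a third command (another rsync, or a cloud sync tool) run by the same topgrade-triggered script.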

[–] Gieselbrecht@feddit.org 1 points 4 months ago (4 children)

I'm curious: is there a reason why no one uses deja-dup? I use it with an external SSD on Ubuntu and (recently) Mint, where it comes pre-installed, and did not encounter problems.

[–] bubbalouie@lemmy.ml 1 points 4 months ago

I rsync ~/ to a USB nub. A no-brainer.
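That one-liner, written out in the same style as the script further down (the mount point is made up):

```python
import os

def home_to_usb_command(usb_mount):
    """Mirror the home directory onto a mounted USB drive.

    -a preserves permissions and timestamps, --delete removes files that
    were deleted at home, and the trailing slashes make rsync copy the
    directory's contents rather than nesting a new directory inside.
    """
    home = os.path.expanduser("~")
    return ["rsync", "-a", "--delete", home + "/", usb_mount.rstrip("/") + "/"]

cmd = home_to_usb_command("/media/usb/home-backup")
```

Note this is a mirror, not a versioned backup: a deletion or corruption at home propagates to the USB copy on the next run.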

[–] lattrommi@lemmy.ml 1 points 4 months ago

I want to say I'm glad you asked this, and thank you for asking. In this day and age there are a lot of valid concerns about privacy and anonymity, and as a result people rarely share openly how their system(s) work. I'm still fairly new to Linux (3.5 years) and at times I feel like I am doing everything wrong and that there is probably a better way. Posts like these help me learn about possible improvements or mistakes I might have made.

I previously used Vorta with Borgbackup locally, automatically backing up my Home (sans things like .cache and .mozilla) to a secondary internal drive every other day. I also would manually back up a smaller set of important documents (memes and porn #joke) to a USB flash drive, to keep on my person, which also would be copied across several cloud storage providers (dropbox, mega, proton), depending on how much space their free versions provided, with items removed according to how much I trusted the provider.

Then I built a new system. In the process of setting it all up, I had a few hiccups, and it took longer than I expected to get a stable system. That was over a year ago (stat / ...Birth: 2024-02-05 04:20:53...) and I still haven't gotten around to setting up any backup system on it. I want to rethink my old solution, and this post is useful for learning about the options available. It's also a reminder to get it done before it is too late: where I live, tornado season is starting. I lost a lot in 2019 after my city had four tornadoes in one day.

[–] bitcrafter@programming.dev 1 points 3 months ago

I created a script that I dropped into /etc/cron.hourly which does the following:

  1. Use rsync to mirror my root partition to a btrfs partition on another hard drive (which only updates modified files).
  2. Use btrfs subvolume snapshot to create a snapshot of that mirror (which only uses additional storage for modified files).
  3. Move "old" snapshots into a trash directory so I can delete them later if I want to save space.

It is as follows:

#!/usr/bin/env python
from datetime import datetime, timedelta
import os
import pathlib
import shutil
import subprocess
import sys

import portalocker

DATETIME_FORMAT = '%Y-%m-%d-%H%M'
BACKUP_DIRECTORY = pathlib.Path('/backups/internal')
MIRROR_DIRECTORY = BACKUP_DIRECTORY / 'mirror'
SNAPSHOT_DIRECTORY = BACKUP_DIRECTORY / 'snapshots'
TRASH_DIRECTORY = BACKUP_DIRECTORY / 'trash'

EXCLUDED = [
    '/backups',
    '/dev',
    '/media',
    '/lost+found',
    '/mnt',
    '/nix',
    '/proc',
    '/run',
    '/sys',
    '/tmp',
    '/var',

    '/home/*/.cache',
    '/home/*/.local/share/flatpak',
    '/home/*/.local/share/Trash',
    '/home/*/.steam',
    '/home/*/Downloads',
    '/home/*/Trash',
]

OPTIONS = [
    '-avAXH',
    '--delete',
    '--delete-excluded',
    '--numeric-ids',
    '--relative',
    '--progress',
]

def execute(command, *options):
    print('>', command, *options)
    subprocess.run((command,) + options).check_returncode()

execute(
    '/usr/bin/mount',
    '-o', 'rw,remount',
    BACKUP_DIRECTORY,
)

try:
    with portalocker.Lock(os.path.join(BACKUP_DIRECTORY,'lock')):
        execute(
            '/usr/bin/rsync',
            '/',
            MIRROR_DIRECTORY,
            *(
                OPTIONS
                +
                [f'--exclude={excluded_path}' for excluded_path in EXCLUDED]
            )
        )

        execute(
            '/usr/bin/btrfs',
            'subvolume',
            'snapshot',
            '-r',
            MIRROR_DIRECTORY,
            SNAPSHOT_DIRECTORY / datetime.now().strftime(DATETIME_FORMAT),
        )

        snapshot_datetimes = sorted(
            (
                datetime.strptime(filename, DATETIME_FORMAT)
                for filename in os.listdir(SNAPSHOT_DIRECTORY)
            ),
        )

        # Keep the last 24 hours of snapshot_datetimes
        one_day_ago = datetime.now() - timedelta(days=1)
        while snapshot_datetimes and snapshot_datetimes[-1] >= one_day_ago:
            snapshot_datetimes.pop()

        # Helper function that keeps the newest snapshot in the current
        # day/week/month group and moves the rest of that group to the trash
        def prune_all_with(get_metric):
            this = get_metric(snapshot_datetimes[-1])
            snapshot_datetimes.pop()
            while snapshot_datetimes and get_metric(snapshot_datetimes[-1]) == this:
                snapshot = SNAPSHOT_DIRECTORY / snapshot_datetimes[-1].strftime(DATETIME_FORMAT)
                snapshot_datetimes.pop()
                execute('/usr/bin/btrfs', 'property', 'set', '-ts', snapshot, 'ro', 'false')
                shutil.move(snapshot, TRASH_DIRECTORY)

        # Keep daily snapshot_datetimes for the last month
        last_daily_to_keep = datetime.now().date() - timedelta(days=30)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_daily_to_keep:
            prune_all_with(lambda x: x.date())

        # Keep weekly snapshot_datetimes for the last three months
        last_weekly_to_keep = datetime.now().date() - timedelta(days=90)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_weekly_to_keep:
            prune_all_with(lambda x: x.date().isocalendar().week)

        # Keep monthly snapshot_datetimes forever
        while snapshot_datetimes:
            prune_all_with(lambda x: x.date().month)
except portalocker.AlreadyLocked:
    sys.exit('Backup already in progress.')
finally:
    execute(
        '/usr/bin/mount',
        '-o', 'ro,remount',
        BACKUP_DIRECTORY,
    )
[–] Cysioland@lemmygrad.ml 1 points 3 months ago

Borg to a NAS, and that mirrored to Backblaze
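One way to wire that up, in the same command-building style as the script above. The commenter doesn't say what does the mirroring; rclone is an assumption here, and the repository path, bucket, and remote names are placeholders.

```python
def borg_to_nas_command(nas_repo, paths):
    """Create a borg archive in a repository on the NAS (e.g. an NFS mount).

    borg expands the {now} placeholder into a timestamped archive name.
    """
    return ["borg", "create", f"{nas_repo}::{{now}}", *paths]

def mirror_to_b2_command(nas_repo, b2_remote):
    """Mirror the finished repository to Backblaze B2 with rclone."""
    return ["rclone", "sync", nas_repo, b2_remote]

print(mirror_to_b2_command("/mnt/nas/borg", "b2:my-bucket/borg"))
# ['rclone', 'sync', '/mnt/nas/borg', 'b2:my-bucket/borg']
```

Mirroring the repository (rather than running borg against the cloud directly) keeps the B2 copy a byte-for-byte replica, so deduplication and encryption carry over for free.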
