this post was submitted on 19 Nov 2023
5 points (100.0% liked)

Self-Hosted Main


A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.


We welcome posts that include suggestions for good self-hosted alternatives to popular online services, how they are better, or how they give back control of your data. Also include hints and tips for less technical readers.



Having been so meticulous about taking backups, I've perhaps not been as careful about where I stored them, so I now have loads of duplicate files in various places. I've tried various tools (fdupes, Czkawka, etc.), but none seems to do what I want. I need a tool where I can specify which folder (and its subfolders) is the source of truth, have it look for anything else, anywhere else, that's a duplicate, and give me the option to move or delete it. Seems simple enough, but I have found nothing that allows me to do that. Does anyone know of anything?
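
In case it helps clarify what I'm after, here's a rough Python sketch of the behaviour I mean (the paths are just placeholders, and moving duplicates into a quarantine folder rather than deleting them is only one way to handle the last step):

```python
#!/usr/bin/env python3
# Rough sketch of the workflow described above, not an existing tool:
# hash every file under a "source of truth" directory, then walk the
# other locations and move anything whose content already exists in
# the source tree. All paths here are hypothetical examples.
import hashlib
import shutil
from pathlib import Path

SOURCE_OF_TRUTH = Path("/data/master")            # hypothetical path
SEARCH_ROOTS = [Path("/data/old-backups")]        # hypothetical paths
QUARANTINE = Path("/data/duplicates-to-review")   # move target instead of deleting

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 of a file, read in chunks so large files are OK."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def main() -> None:
    # Index the source of truth: content hash -> canonical path.
    truth = {}
    for p in SOURCE_OF_TRUTH.rglob("*"):
        if p.is_file():
            truth[file_hash(p)] = p

    # Anything outside the source tree with a matching hash is fair game.
    for root in SEARCH_ROOTS:
        for p in root.rglob("*"):
            if not p.is_file():
                continue
            digest = file_hash(p)
            if digest in truth:
                print(f"duplicate of {truth[digest]}: {p}")
                # Move rather than delete, so the result can be reviewed first.
                target = QUARANTINE / p.relative_to(root)
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.move(str(p), target)

if __name__ == "__main__":
    main()
```

Something pre-built with a proper dry-run mode and a nicer report would obviously be preferable, but that's the gist of it.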

[–] xewgramodius@alien.top 1 points 11 months ago (1 children)

I don't think there is a good way to tell which of two duplicate files came "first" other than checking the creation date, but if this is Linux, that attribute may not be enabled in your filesystem type.

The closest thing I've seen is a Python dedup script, but after it identifies all the dups it deletes all but one of them and then puts hard links to that real file where all the deleted dups were.
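
The core of that idea is only a few lines; a rough sketch (not the actual script I saw, and the root path is just an example) looks something like this:

```python
# Rough sketch of the hard-link approach: group files by content hash,
# keep the first copy in each group, and replace every other copy with
# a hard link to it. The root path below is a hypothetical example.
import hashlib
import os
from collections import defaultdict
from pathlib import Path

def file_hash(path: Path, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def dedup_with_hardlinks(root: Path) -> None:
    # Group all regular files under root by their content hash.
    groups = defaultdict(list)
    for p in root.rglob("*"):
        if p.is_file() and not p.is_symlink():
            groups[file_hash(p)].append(p)

    # Keep one copy per group; re-link every other copy to it.
    for paths in groups.values():
        keeper, *dups = paths
        for dup in dups:
            dup.unlink()           # delete the duplicate...
            os.link(keeper, dup)   # ...and hard-link it back to the keeper

if __name__ == "__main__":
    dedup_with_hardlinks(Path("/data/backups"))  # hypothetical path
```

The catch is that hard links only work within a single filesystem, and editing the one remaining copy changes what every linked location sees.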

[–] parkercp@alien.top 1 points 11 months ago

Hi @xewgramodius - I'm not actually worried about which came first; the key thing for me is whether a copy is located in the source-of-truth directory. If it's not in there, then it's fair game and can be moved/deleted.