Having been so meticulous about taking backups, I’ve perhaps not been as careful about where I stored them, so I now have loads of duplicate files in various places. I’ve tried various tools (fdupes, Czkawka, etc.), but none seems to do what I want… I need a tool that I can tell which folder (and its subfolders) is the source of truth, and have it look for anything else, anywhere else, that’s a duplicate, and give me the option to move or delete it. Seems simple enough, but I have found nothing that allows me to do that… Does anyone know of anything?

  • parkercp@alien.top (OP) · 1 year ago

    I’d like to find something that has that capability, so I can say multimedia/photos/ is the source of truth and anything identical found elsewhere is a duplicate. I hoped this would be an easy thing to do, as the ask is simply to ignore any duplicates in a particular folder hierarchy…
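    A rough sketch of what that workflow could look like in Python, in case it helps while searching for a ready-made tool: hash everything under the source-of-truth tree once, then walk the other locations and move anything with a matching hash into a quarantine folder for review rather than deleting it. The folder names (multimedia/photos, backups, old_nas, duplicates_to_review) are placeholders.

      #!/usr/bin/env python3
      """Sketch: treat one tree as the source of truth, quarantine duplicates found elsewhere."""
      import hashlib
      import shutil
      from pathlib import Path

      SOURCE_OF_TRUTH = Path("multimedia/photos")        # files here are never touched
      SEARCH_ROOTS = [Path("backups"), Path("old_nas")]  # placeholder locations to scan
      QUARANTINE = Path("duplicates_to_review")          # move here instead of deleting

      def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
          """Hash a file in chunks so large media files don't exhaust memory."""
          digest = hashlib.sha256()
          with path.open("rb") as handle:
              for chunk in iter(lambda: handle.read(chunk_size), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      # 1. Index the source of truth: content hash -> canonical path.
      canonical = {}
      for file in SOURCE_OF_TRUTH.rglob("*"):
          if file.is_file():
              canonical.setdefault(sha256_of(file), file)

      # 2. Scan everywhere else; anything whose hash is in the index is a duplicate.
      for root in SEARCH_ROOTS:
          for file in root.rglob("*"):
              if not file.is_file():
                  continue
              if file.resolve().is_relative_to(SOURCE_OF_TRUTH.resolve()):
                  continue  # never touch the source of truth (is_relative_to needs Python 3.9+)
              digest = sha256_of(file)
              if digest in canonical:
                  target = QUARANTINE / root.name / file.relative_to(root)
                  target.parent.mkdir(parents=True, exist_ok=True)
                  print(f"duplicate of {canonical[digest]}: moving {file}")
                  shutil.move(str(file), str(target))

    Comparing file sizes before hashing would speed this up a lot on a big photo library, but the basic idea is the same.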

    • lilolalu@alien.top · 1 year ago

      Well that’s possible with a lot of deduplicators. But I’d take a look at duff:

      https://manpages.ubuntu.com/manpages/xenial/man1/duff.1.html

      https://github.com/elmindreda/duff

      The duff utility reports clusters of duplicates in the specified files and/or directories. In the default mode, duff prints a customizable header, followed by the names of all the files in the cluster. In excess mode, duff does not print a header, but instead for each cluster prints the names of all but the first of the files it includes.

      If no files are specified as arguments, duff reads file names from stdin.
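      To apply that to the “source of truth” case, a small wrapper along these lines might do it. This is only a sketch: it assumes duff is installed and that its -r (recurse) and -e (excess mode) options behave as described in the man page above, and the directory names are placeholders.

        #!/usr/bin/env python3
        """Sketch: use duff's excess mode to list duplicate candidates outside a chosen tree."""
        import subprocess
        from pathlib import Path

        SOURCE_OF_TRUTH = "multimedia/photos"     # listed first so its copies tend to be the ones kept
        OTHER_LOCATIONS = ["backups", "old_nas"]  # placeholder trees to check

        # In excess mode duff prints, per cluster, every file except the first one,
        # so with the source tree listed first the output is a list of removal candidates.
        result = subprocess.run(
            ["duff", "-r", "-e", SOURCE_OF_TRUTH, *OTHER_LOCATIONS],
            capture_output=True, text=True,
        )

        for line in result.stdout.splitlines():
            path = Path(line)
            # duff doesn't promise which cluster member counts as "first", so only
            # report files that clearly live outside the source-of-truth tree.
            if not path.is_relative_to(SOURCE_OF_TRUTH):
                print("duplicate candidate:", path)

      Since it only prints candidates, nothing gets deleted until you feed that list to mv or rm yourself, which seems safer for a first pass.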