Holy tedious manual removal of duplicate image files, Batman.

@cameron give jdupes a try:

jdupes -rSs ./ > dupes.txt

That will list all duplicates in the current folder, recursing into subfolders, and write the results to a text file.

Or include the -d flag to interactively pick which files to keep and which to delete.

Believe me, I've been clearing out hundreds of gigabytes of files for a few years now, most of them dupes, and jdupes is invaluable.
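
For non-interactive cleanup there's also -N (no-prompt), which combined with -d keeps the first file in each duplicate set and deletes the rest. A minimal sketch, assuming a recent jdupes build (check jdupes --help on yours) and a hypothetical ./Pictures folder:

# preview first: list duplicate sets without deleting anything
jdupes -r ./Pictures

# then delete all but the first file in each set, no prompting
jdupes -rdN ./Pictures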

@kensp @matt @cameron I recently used a GUI application called FSlint to deduplicate my ROM files after combining two libraries of them. There is a "select using wildcard" option that makes it a cinch to pick which ones to delete (for example: */ROMS2/*).
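
A rough command-line equivalent of that wildcard trick, as a hedged sketch (not FSlint itself; the ROMS1/ROMS2 paths are just illustrative, and this only previews matches rather than deleting anything):

# list duplicate sets across both libraries
jdupes -r ./ROMS1 ./ROMS2 > dupes.txt

# preview which copies live under ROMS2 before deciding what to delete
grep '/ROMS2/' dupes.txt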

@ajdunevent @kensp @matt

Oh that's a nice lil application! I didn't know about that one, and it was already installed on my system. :) Very handy.

@matt @hund
Thanks for the ideas. I have a bunch of images that are duplicates but have different file names in the same folders, so no matter how I go at it, it's tedious. I'm getting through though. :/

@hund
I did, yes, and thank you! I have been using a combination of gThumb, my file manager, and FSlint to dig through, but I was grumpy because I have to check each one manually no matter how I do it. It's all good though. Almost done. I cleared out about 20 GB of duplicates yesterday. :)

@cameron Ah! True. :) Cool! But how did you end up with duplicates worth 20 GB of disk space? :P

@hund
Some weird stuff with file name capitalization, giving me file.jpg and file.JPG in the same folder. And somehow, copying files in the past auto-added a numeral to the end of the file name rather than replacing it. So I had file_1, file_2 and so on... Nightmares!
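
For what it's worth, those case-only collisions can be spotted without a dedicated tool. A minimal sketch, assuming GNU sort and uniq:

# list file paths that collide when case is ignored (file.jpg vs file.JPG)
find . -type f | sort -f | uniq -Di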

@cameron I have probably a few of those as well. ;)

@cameron @matt You liked it, so I guess you did. :P

As you can see in the image, gThumb can find duplicate images no matter where they are or what filenames they have.
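
The same content-based idea works from the shell for byte-identical files. A minimal sketch, assuming GNU md5sum and uniq (this only catches exact copies; gThumb does its own comparison):

# hash every file, then group lines whose first 32 chars (the MD5 digest) match
find . -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate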
