killerasfen.blogg.se

Linux exiftool








FSlint is very easy to use. Tell it what directory or directories to search, click Find, and then go away and let it work. You can fine-tune the exclusion list to your heart’s content. When it’s finished it presents you with a list of files organized the same way as in our example dupes.txt. Click the Select button to choose which files to delete: all but the first in each group, all but the newest, or all but the oldest. I select all but the first in each group and then click Delete. Obviously, all of these commands work on any file and not just photographs.
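FSlint's "all but the first in each group" selection can be approximated in the shell from a dupes.txt-style list, if you prefer staying on the command line. This is only a sketch run on throwaway files, not FSlint itself; the file names and contents below are invented for the demo.

```shell
# Build a throwaway directory with two identical files and one unique file,
# then delete everything except the first file of each duplicate group.
dir=$(mktemp -d)
echo "same bytes" > "$dir/a.jpg"
echo "same bytes" > "$dir/b.jpg"     # duplicate of a.jpg
echo "different"  > "$dir/c.jpg"
dupes=$(mktemp)
find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate > "$dupes"
# Groups are separated by blank lines; skip the first entry of each group
# and delete the rest. The filename starts at column 35 of md5sum output
# (32-character hash plus two spaces).
awk 'NF==0 {first=0; next} {if (first++) print substr($0, 35)}' "$dupes" |
  while read -r f; do rm -- "$f"; done
ls "$dir"     # a.jpg and c.jpg survive; b.jpg is gone
```

The `awk` filter relies only on the blank-line-separated format the earlier pipeline produces, so it works on any dupes.txt built that way.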


fdupes Finds and Removes Duplicate Files

Another way to look for duplicates is with the fdupes command, whose only job in life is to find duplicate files, and to optionally delete them. It operates like our long find command, using md5sums, with the advantage of being simpler to use. It has a simple set of options that you can read all about in man fdupes, and it displays a progress meter as it works. This example counts up all the duplicates in Pictures, and how much disk space they’re using:

$ fdupes -rSm Pictures/
5554 duplicate files (in 4301 sets), occupying 41484.8 megabytes

It is reassuring to see awk and fdupes give the same results. fdupes will also delete duplicate files with the -d option, and ask you to confirm each deletion. Use the -N option to turn off the confirmation prompt and to delete all duplicates without bothering you.

I’m sure some of you are waving your hands and going “Hey, what about FSlint?” FSlint is my tool of choice for this job. FSlint, “filesystem lint”, is a nice graphical filesystem cleaner that finds and removes duplicates, along with a host of other functions such as finding redundant whitespace, empty directories, bad symlinks, files with missing user IDs, and listing your installed packages. By default it excludes lost+found, /dev, /proc, /sys, and /tmp, plus git, CVS, svn, and bzr files.
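The awk cross-check alluded to above can be sketched like this: it reads a dupes.txt built by the find/md5sum pipeline and reports duplicate counts in the same shape as fdupes -m. The original article does not show its exact awk program, so this is one plausible version, run here on invented demo files rather than a real photo archive.

```shell
# Three copies of one file and two of another: 3 redundant copies in 2 sets.
dir=$(mktemp -d)
printf 'x\n' > "$dir/a"; printf 'x\n' > "$dir/b"; printf 'x\n' > "$dir/c"
printf 'y\n' > "$dir/d"; printf 'y\n' > "$dir/e"
dupes=$(mktemp)
find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate > "$dupes"
# files = every line in a duplicate group; sets = distinct hashes seen.
# "Duplicate files" in fdupes' sense is everything beyond the first copy
# of each set, i.e. files - sets.
summary=$(awk 'NF==0 {next}
     {files++; if ($1 != prev) {sets++; prev=$1}}
     END {printf "%d duplicate files (in %d sets)\n", files - sets, sets}' "$dupes")
echo "$summary"     # → 3 duplicate files (in 2 sets)
```

Running fdupes -rm on the same directory should report the same totals, which is the cross-check the article mentions.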


So how many are duplicates? Again, we turn to find. It finds duplicates by generating and matching an md5sum hash on each file, and then using sort and uniq to print all the photo filenames in a text file, with duplicates listed together and separated by a blank line. It finds only duplicates, and will not count files that are not duplicated:

$ find Pictures/ -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate > dupes.txt

This incantation will take some time to run, depending on the speed of your computer and the size of your photo archives. So I have 4301 unique photos, and 5554 duplicates to weed out. At this point I could cobble up a compound command to move or delete the duplicates, but there are easier tools to do this.
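You can see the dupes.txt format on a small throwaway directory without waiting on a multi-gigabyte archive. The sample files here are invented for the demo:

```shell
# Two duplicate files and one unique file; the unique file never appears
# in the output, exactly as described above.
dir=$(mktemp -d)
echo "winter" > "$dir/IMG_001.jpg"
echo "winter" > "$dir/IMG_001-copy.jpg"
echo "summer" > "$dir/IMG_002.jpg"
out=$(find "$dir" -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate)
printf '%s\n' "$out"
# Prints one group: the two "winter" files, each prefixed with the same
# md5 hash. IMG_002.jpg is absent because it has no duplicate.
```

The `-w32` tells uniq to compare only the first 32 characters of each line, i.e. the md5 hash, so files count as duplicates whenever their contents match, regardless of name.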


In the olden days of photography we thought we were ready for anything with a few 36-exposure film cassettes in our bags. Now we can capture gigabytes of photos in just a few minutes, without reloading. I have a 32GB card in my camera, which holds 1700+ RAW images at 18MB each. Over the years I’ve created duplicates by dumping them onto my computer when I was in a hurry, and making backups without rhyme or reason, so I want to hunt down all the duplicates and get rid of them. But I’m looking at 205GB of photos:

$ du -sh Pictures/

How many photos is that? Like, a way lot. Don’t worry, I won’t make you look at all of them. All of my photos are in a single directory, Pictures, so I don’t have to search multiple directories. This counts all the files in Pictures without counting directories:

$ find Pictures/ -type f | wc -l
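The counting commands above can be tried safely on a throwaway directory tree; the directory layout and file names below are invented for the demo:

```shell
# Create a small tree: three files spread across two subdirectories.
pics=$(mktemp -d)
mkdir -p "$pics/2023" "$pics/2024"
echo a > "$pics/2023/one.jpg"
echo b > "$pics/2023/two.jpg"
echo c > "$pics/2024/three.jpg"
du -sh "$pics"                      # total disk usage, human-readable
count=$(find "$pics" -type f | wc -l)
echo "$count"                       # → 3: -type f counts files only,
                                    #   not the directories containing them
```

Without `-type f`, find would also list the two subdirectories and the top-level directory itself, inflating the count.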


Digital cameras make it easy to rack up gigabytes of photo archives. Before doing anything, please have a good backup of your photo files. Let’s learn how to find duplicates and organize all of your photos without making it your life’s work.








