Great Script to Tidy Up Our Photos

I was looking for a way to tidy up our photos on the NAS at home and had tried a number of things that just did not fit the bill.

They were either just too difficult or completely wrong.

I then stumbled upon this blog post: http://falesafe.wordpress.com/2009/07/07/photo-management/

What a gem. If you can install Ruby on a machine and you need to sort your photos, this is it.

The post is from 2009, so I had to update some of the gems it uses as well as change some of the code, but it was not much work.

Thanks to Falesafe for making it available. It had another bonus: I found out that we have 76,000 photos.

I feel some culling is needed.

EDIT:

I had to work on the script a bit, as the EXIF attribute it was using (date_time) was causing my photos to be sorted incorrectly.
So I have updated the script to use the date_time_original attribute, and this has now sorted my photos properly for me. Comments on the original post are closed, so I will upload the adjusted script here if you want to use it.
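
For reference, the difference between the two attributes is easy to check with the exifr gem in irb (the file path below is just a placeholder):

require 'exifr'

# date_time is the EXIF DateTime tag, which editing and copying software often
# rewrites, while date_time_original records when the photo was actually taken.
jpeg = EXIFR::JPEG.new('photo.jpg')   # placeholder path
puts jpeg.date_time
puts jpeg.date_time_original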

#!/usr/bin/ruby
# == Synopsis
#
# This script examines a source directory for photos and movie files and moves them to
# a destination directory.  The destination directory will contain a date-hierarchy of folders.
#
# == Usage
#
# ruby photo_organizer.rb [ -h | --help ] source_dir destination_dir
#
# == Author
# Doug Fales, Falesafe Consulting, Inc.
# 
# == Change Log
# LANCE HAIG = changed the EXIF attribute used to determine photo date taken to .date_time_original
#
# == Copyright
# Copyright (c) 2009 Doug Fales.
# Licensed under the same terms as Ruby.
require 'rubygems'
require 'exifr'
require 'fileutils'
require 'find'
require 'logger'
require 'optparse'
require 'pathname3'
require 'digest/sha3'

STDOUT.sync = true


#$log = Logger.new("photo_organizer.log", 3, 20*1024*1024)  # Log files up to 20MB, keep at least three around
#$log.info("Photo organizer started...")

def log
	@log ||= Logger.new("photo_organizer.log", 3, 20*1024*1024)  # Log files up to 20MB, keep at least three around
	@log
end

log.info("Photo organizer started...")


def usage()
	puts <<-EOS
Usage: ruby photo_organizer.rb [ -h | --help ] source_dir destination_dir
	EOS
	exit
end

opts = OptionParser.new
opts.on("-h", "--help") { usage() }
opts.parse!(ARGV)
usage() if ARGV.length != 2

source_dir = ARGV[0]
$dest_dir  = ARGV[1]
unless File.directory?(source_dir) && File.directory?($dest_dir)
	puts "Both source_dir and destination_dir must be existing directories."
	usage()
end

$count = 0

# Print a progress dot for every ten files moved.
def increment_counter
	$count += 1
	print "." if ($count % 10) == 0
end

# Move a photo into <destination_dir>/YYYY/YYYY_MM_DD/, creating the folders
# as needed.  A file already present at the destination with an identical
# SHA3 digest is skipped; a different file with the same name gets a numeric
# suffix so nothing is overwritten.  Returns true when the file was moved.
def move_image(file, time)
	dest_sub = File.join($dest_dir, time.strftime("%Y"), time.strftime("%Y_%m_%d"))
	FileUtils.mkdir_p(dest_sub)
	dest = File.join(dest_sub, File.basename(file))
	if File.exist?(dest)
		if Digest::SHA3.file(file).hexdigest == Digest::SHA3.file(dest).hexdigest
			log.info("Identical file already at #{dest}; skipping #{file}")
			return false
		end
		ext    = File.extname(file)
		base   = File.basename(file, ext)
		suffix = 1
		while File.exist?(dest)
			dest = File.join(dest_sub, "#{base}_#{suffix}#{ext}")
			suffix += 1
		end
	end
	log.info("Moving #{file} -> #{dest}")
	FileUtils.mv(file, dest)
	true
end

Find.find(source_dir) do |f|
	case
	when File.file?(f)
		begin
			time = EXIFR::JPEG.new(f).date_time_original
		rescue => e
			if(f =~ /.DS_Store/)
				log.info("Skipping .DS_Store")
				next
			elsif (e.message =~ /malformed JPEG/)
				log.info("Malformed JPEG: #{f}")
				next
			end
		end

		if(time.nil?) 
			log.info("WARNING: No EXIF time for: #{f}.  Will skip it.")
			next
		end

		was_moved = move_image(f, time)
		increment_counter if was_moved

	when File.directory?(f)
		log.info("Processing directory: #{f}")
	else "?"
		log.info("Non-dir, non-file: #{f}")
	end
end

puts "\nFinished."

Quest NDS Migrator LogFile Parser

I am currently helping a customer migrate from Novell to Microsoft, and they are using the Quest migrator product to move their data to new DFS servers.
They currently have a large amount of data stored across a number of volumes. The sheer number of volumes and the amount of data have required them to deploy a large number of copy engine servers.
The copy engine does not use a central logging facility; it stores its log files in a folder alongside the copy engine.
This unfortunately has a side effect: there are now quite a few log files, and some are over 1.5 GB in size.
Trying to load these files into a text editor has proven impossible and unworkable, so another way was needed.

I decided that the best way to achieve this was to write a script that would parse the log files and extract the errors into a smaller file that is easier to work with.
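
The actual script is the PowerShell one in codebox 1 below; purely to illustrate the idea of streaming the log line by line rather than loading the whole 1.5 GB file at once, a rough sketch (here in Ruby, with placeholder file names and an assumed "ERROR" marker) looks like this:

# Sketch of the approach only -- file names and the "ERROR" marker are placeholders.
input  = "copy_engine.log"
output = "copy_engine_errors.log"

File.open(output, "w") do |out|
  # Read one line at a time so memory use stays flat even for very large logs,
  # and keep only the lines that record an error.
  File.foreach(input) do |line|
    out.write(line) if line.include?("ERROR")
  end
end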

I decided to use PowerShell as the scripting language, as it runs on the new infrastructure and the script can be run on a copy engine server with enough disk space.

It took quite a bit of research and trial and error, but eventually I had a working script.

This script is not signed, so you will either need to sign it or relax the execution policy with Set-ExecutionPolicy on the system you are going to use.

The script uses two files: the main script file and a CSV file with the volume names and copy engine server names.

Below you will find a copy of both. I have also created a git repository on GitHub if you would like to help make it better.

Original PowerShell Script

[codebox 1]

Original CSV File

[codebox 2]