In the previous part we looked at making the original script survive a restart without losing progress. PowerShell actually has a built-in feature that provides this functionality: the workflow. If you run a workflow as a job, you can pause, resume and restart the workflow, so progress is saved.
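As a rough sketch of how that looks (illustrative only: workflows exist in Windows PowerShell 3.0–5.1 and were removed in PowerShell 7+, and `Find-DuplicateFiles` is an invented name, not the script from the post):

```powershell
# Workflows require Windows PowerShell (3.0-5.1); they are not in PowerShell 7+.
workflow Find-DuplicateFiles {
    param([string]$Path)
    foreach ($file in Get-ChildItem -Path $Path -Recurse -File) {
        Get-FileHash -Path $file.FullName
        Checkpoint-Workflow   # persist progress so a resumed job restarts here
    }
}

# Run as a job so it can be suspended and resumed
$job = Find-DuplicateFiles -Path 'D:\Photos' -AsJob
Suspend-Job $job   # state is checkpointed to disk
Resume-Job $job    # picks up from the last checkpoint
```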
The syntax is pretty straightforward, but there are some strange rules about using workflows which make it a little trickier. Continue reading “PowerShell : Finding Duplicate Files, Part 3 : A Resumeable Script Using Workflow”
The main issue with my original script here was that, with the sheer number of pictures we had, it didn’t finish in a reasonable time. What I needed was a way for the script to resume after an interruption (like a reboot). So I took the original script and whacked it with the ScriptHammer(TM) again.
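The resume mechanism can be sketched in a few lines (a minimal illustration of the checkpointing idea, not the actual script from the post; `$stateFile` and `Find-DupHashes` are invented names):

```powershell
# Sketch: hash files one at a time and persist results after each one, so an
# interrupted run can reload the state file and skip work already done.
$stateFile = Join-Path ([IO.Path]::GetTempPath()) 'dup-hashes.xml'

function Find-DupHashes([string]$Path) {
    # Reload any progress saved by a previous (interrupted) run
    $done = @{}
    if (Test-Path $stateFile) {
        @(Import-Clixml $stateFile) | ForEach-Object { $done[$_.Path] = $_ }
    }
    foreach ($file in Get-ChildItem -Path $Path -Recurse -File) {
        if ($done.ContainsKey($file.FullName)) { continue }  # already hashed
        $done[$file.FullName] = Get-FileHash -Path $file.FullName
        $done.Values | Export-Clixml $stateFile              # checkpoint
    }
    $done.Values
}
```

Writing the state file after every single hash is wasteful; checkpointing every N files is the obvious tuning knob.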
Updated script after the jump with notes following!
Continue reading “PowerShell : Finding Duplicate Files, Part 2 : A Resumeable Script”
We have a lot of photos, music and files. Normally we copy the files up to some network storage (so it’s backed up), and later on we come back and name and sort everything. But then some chaos happens: maybe we get distracted and end up copying the files twice. Or maybe we’ve been away, I’ve backed the files up to another device, and then copied two sets of files later. Or a set of files was copied to another computer and some of them (but not all) were modified before being copied back.
The upshot is that precious NAS storage is being wasted. But how to find where the duplicate files are? Sounds like an excuse, er, reason to put on my scripting hat!
Since writing this I have done some more work on the script, so there are now two more parts: this post covers the basic script, part 2 details adding code to allow resuming the script after a restart (or crash), and part 3 does the same thing using PowerShell workflows.
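The heart of the basic approach is simple enough to sketch: hash every file and group by hash, and any group with more than one member is a set of duplicates (an illustrative sketch, not the full script from the post; `Get-DuplicateFiles` is an invented name):

```powershell
# Sketch: find duplicate files under a path by grouping on content hash.
function Get-DuplicateFiles([string]$Path) {
    Get-ChildItem -Path $Path -Recurse -File |
        Get-FileHash |
        Group-Object -Property Hash |
        Where-Object { $_.Count -gt 1 } |    # only hashes seen more than once
        ForEach-Object { $_.Group.Path }     # the duplicate file paths
}
```

A common optimisation is to group on file length first and only hash files that share a size, since hashing every byte on a large NAS share is slow.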
Continue reading “PowerShell : Finding Duplicate Files, Part 1 : The Basic Script”