That Fine Line Between Scripts and Applications
I've been working on fixing up a script that takes a Bloomberg field definition file and generates a batch of SQL statements to populate a database with the field definitions, and it's right on the edge of really needing to be an application. It's all in bash now, but it really ought to be in Perl, or maybe even Java or C++. The file is thousands of lines, so an application that can detect the existing data and update only the relevant records would go a long way toward making this better. Perl could do it; this bash scripting is just awfully limited.
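Even in shell, you can get partway to "only update the relevant records" by emitting idempotent SQL instead of plain inserts. This is just a sketch on made-up data - the pipe-delimited format, the `field_defs` table, and the Postgres-style `ON CONFLICT` syntax are all my assumptions, not the real Bloomberg file or target database:

```shell
#!/bin/sh
# Hypothetical input: pipe-delimited "mnemonic|description" lines.
printf '%s\n' 'PX_LAST|Last price' 'PX_OPEN|Opening price' > fields.txt

# Emit one upsert per line, so re-running the load only changes
# rows that actually differ (Postgres ON CONFLICT syntax assumed).
awk -F'|' -v q="'" '{
    gsub(q, q q)   # double any embedded single quotes for SQL
    printf "INSERT INTO field_defs (mnemonic, description)\n"
    printf "VALUES (%s%s%s, %s%s%s)\n", q, $1, q, q, $2, q
    printf "ON CONFLICT (mnemonic) DO UPDATE SET description = EXCLUDED.description;\n"
}' fields.txt > load.sql

cat load.sql
```

With that, blowing away and regenerating the whole table stops being necessary: the same `load.sql` can be run against the database repeatedly.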
Oh, I'm going to finish it - I'm nearly done, and there's nothing in the requirements of the task that can't be done with bash and the other unix tools. It's just that had I known up front about the issues with this updated file from Bloomberg, I'd probably have opted for a Perl script from the get-go.
There are spurious backslashes in the file, so I have to sed them out into a temporary file. Then there are the problems with the intended primary key: this time around the file has duplicates on it, because the field really isn't meant to be unique from Bloomberg's point of view - I was only using it as a primary key because it happened to be unique in the first version of the file. Silly me.
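Both of those checks are quick one-liners, which is about the only thing bash has going for it here. A sketch on a made-up sample (the pipe-delimited layout is my assumption, not the actual Bloomberg format):

```shell
#!/bin/sh
# Made-up sample: a stray backslash and a duplicated first field.
printf '%s\n' 'PX\_LAST|Last price' 'PX_OPEN|Opening price' 'PX_OPEN|Open px' > raw.txt

# 1. Strip the spurious backslashes into a temporary file.
sed 's/\\//g' raw.txt > clean.txt

# 2. Check whether the would-be primary key (field 1) is actually
#    unique: uniq -d prints any key that appears more than once.
cut -d'|' -f1 clean.txt | sort | uniq -d
```

Running the `uniq -d` check before loading anything would have surfaced the duplicate-key problem immediately, instead of an hour in.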
So something that should have taken five minutes is now in its second hour as I find each of these issues in the 9,000+ line file. The box it's running on isn't slow; there's just a lot to do, and I'm not being very efficient, because each time I'm blowing away all the data and regenerating it. Again, I thought this was 'easy'. Silly me.
The lesson in all this is that no one is right all the time, and even if you're right today, tomorrow will bring facts and circumstances totally unknown today, and they will make the decision wrong. We have to be flexible and willing to see what's right and wrong and fix it - even if it means rewriting the entire process.