Today I got a call from the original developer of this project I've somewhat taken over. He still has components of this project that are relatively new and that I haven't even been introduced to. Needless to say, there's no documentation whatsoever - other than what I've put into the codebase as I've been working on it.
So today when he called and said that one of his reports was missing a lot of data, I knew it was not going to be fun. He was really busy with this other project that his desk was depending on, and it was going to be up to me to figure out what had happened and how to get it back to "right".
Documentation: Zero
As I've said, this guy didn't document a thing. There are a few high-level, 10,000-ft. documents about how things are supposed to work. The problem is, this project was built with Spring, and while Spring certainly has its benefits, making a reasonable, understandable, documented system isn't one of them. Confused? Sure. Exotic XML config files? Roger.
I don't mean to harp on this, but I'm going to because it's just that important. If you can describe the system at a high level as simple and flexible, then the implementation ought to be simple and flexible. It seems that Java has created this entire cadre of developers who over-design even the simplest of things just because they can. It's one of the most frustrating things I've run into in years.
Every language has its detractors, and my favorites are no exception. Living in a glass house, I'm not one to throw stones, but this really does seem to be something of a pandemic: systems built with tools that all but prevent the developer from writing useful, well-documented, easily understood code.
Unit Tests: Useless
This is the project with the most incomplete, yet copious, set of unit tests that I've ever seen. The unit tests passed completely, but at the same time they didn't point out why I might have absolutely no data in the report. Does that sound like a good set of unit tests? Yet I spent the better part of two days updating these same unit tests when I added a feature that took me about an hour. What's right with that?
If you're going to have unit tests, and beyond that integration "unit" tests, then shouldn't they cover the case where a great deal of the result comes back empty? If it's a coding mistake, the tests should catch it. If it's a data issue that would blow out that much data (like a null pointer), then it should at least be flagged as a warning.
No tests failed or even logged problems.
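The kind of guard I'm describing doesn't have to be elaborate. Here's a minimal sketch, with invented names (`ReportSanityCheck`, the "positions" report) since the real system isn't documented, of a check that would at least log something when a report comes back with no rows instead of passing silently:

```java
import java.util.List;
import java.util.logging.Logger;

// Hypothetical sketch - class and report names are invented for illustration.
public class ReportSanityCheck {
    private static final Logger LOG = Logger.getLogger(ReportSanityCheck.class.getName());

    // Returns true if the report has data; logs a warning when it comes back
    // empty, so a blown-out result leaves a trail instead of passing quietly.
    public static boolean hasData(String reportName, List<?> rows) {
        if (rows == null || rows.isEmpty()) {
            LOG.warning("Report '" + reportName
                    + "' returned no rows - possible data or config problem");
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(hasData("positions", List.of()));       // empty report
        System.out.println(hasData("positions", List.of("row1"))); // populated report
    }
}
```

An assertion like this in the integration tests, or even just the warning in production logs, would have turned a day and a half of spelunking into a one-line grep.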
Getting to the Bottom of It All
I looked at the SVN change logs for all the files. Nothing there would make all the data disappear. I looked at the data in the database - maybe a lead there, but no, it was a false alarm. I looked at the data moving around the system at a level that wasn't easy to locate as there wasn't any documentation (as I've said), but in the end I was able to see the data coming out of the test system and the production system. Interestingly, the data at this level wasn't blank. It appeared that the Flex client was interpreting the data as something it didn't want to display.
Now we were getting somewhere.
When I looked at the data more closely I noticed that the numbers were remarkably similar. Interesting. Then I noticed that one part of the data had the wrong symbol. Very interesting now. So I asked the original developer about that point, and he said the symbol I had was wrong. OK, let's go there.
I had put the configuration of the system into a database so that it's far easier to maintain and is driven by the data maintained by the individual desks. When I put in these normalizing contracts, I entered what I believed to be the right ones. Turns out, for this report and this group of instruments, one of them was wrong.
A one-field change in the database, a restart of the Tomcat instance, and Bingo! The data was back in the report.
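To make the database-driven configuration idea concrete, here's a minimal sketch under assumed names (`ContractConfig`, the instrument groups and symbols are all invented): the normalizing contract per instrument group lives in a data store rather than in code, so a bad value is a one-field fix rather than a redeploy.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch - stands in for the database table described in the text.
public class ContractConfig {
    private final Map<String, String> contractByGroup = new HashMap<>();

    // In the real system this would be a row maintained by the desk.
    public void setContract(String instrumentGroup, String contractSymbol) {
        contractByGroup.put(instrumentGroup, contractSymbol);
    }

    // Fail loudly on a missing mapping rather than silently emitting no data.
    public String contractFor(String instrumentGroup) {
        String symbol = contractByGroup.get(instrumentGroup);
        if (symbol == null) {
            throw new IllegalStateException(
                    "No normalizing contract configured for " + instrumentGroup);
        }
        return symbol;
    }
}
```

Whether fixing the stored value also requires a restart depends on how the application caches this data; in my case the Tomcat instance read the configuration at startup, hence the bounce.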
Lessons Learned
At the end of the experience I was sad to realize that I hadn't really learned much beyond what I already knew: poor documentation, poor tests, poor checking on the client - the list goes on and on. While I know it's like beating a dead horse, as long as I'm forced to work with this system it'll continue to be a source of anguish to me. It's got a lot of nice things in it, but the design and documentation aren't among them.
The code is consistent for the most part, and the goals and attempts are admirable, but in the end it's another victim of that pandemic of over-design and under-documentation. Make it simple... make it clean... work on the documentation... make it something that when another developer looks at it, it makes them code better. Set the Gold Standard for code. Be better than you have to be.
I learned those lessons years ago, and I only hope some of them rub off.