Archive for December, 2007

XML and the Illusion of System Stability

Tuesday, December 11th, 2007

I've been using XML for quite a while. I can see where it's useful, and more often than not, I can see applications of it that are a horrible overuse of XML for the sake of using XML. For example, I can see XML as a configuration file with fewer than 100 lines in it. I have also seen XML used as the transport for data streams of thousands of points. The first is a decent use of the format; the latter is stupid, plain and simple.

So I find myself again today working on a project that I haven't touched for months. It's got a ton of XML configuration - literally dozens of files and thousands of lines - some files having several thousand lines of XML each. I've heard people who are big proponents of XML say "You can have code stability while configuring your app in XML". Well... that's a true statement, but you could replace 'XML' in that statement with just about anything else - including simple key/value text files. But what seems to be the trend is that the XML files actually configure the system - hook up this feed to that processor, and add this dialog box at this point, etc. I've seen this all the way to the point that XML changes alone were responsible for adding completely new functionality to the app.

At this point, the XML really is the code. You can't call the framework that interprets the XML and follows its rules for wiring things together the core, immutable part of the system - the only part needing source control, test cases, and other "project-level" attributes of a codebase - when the way the application actually works is dictated by the XML. It's like calling the JVM the code and the compiled byte-code the "config files". Nope, baby... that's a runtime and that's the code it executes.
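To make that concrete, here's a minimal sketch (all the names and the XML schema are hypothetical, invented for illustration) of "configuration" that is really code: the markup, not the host language, decides which components exist and how they're wired together.

```python
# Hypothetical illustration: the XML below dictates behavior, so changing
# it alone changes what the application does - i.e., the XML is the code.
import xml.etree.ElementTree as ET

WIRING = """
<app>
  <feed id="prices" source="market"/>
  <processor id="vwap" input="prices"/>
</app>
"""

class Feed:
    def __init__(self, source):
        self.source = source

class Processor:
    def __init__(self, input_feed):
        self.input_feed = input_feed

def build(xml_text):
    """Instantiate and wire components exactly as the XML dictates."""
    root = ET.fromstring(xml_text)
    registry = {}
    for node in root:
        if node.tag == "feed":
            registry[node.get("id")] = Feed(node.get("source"))
        elif node.tag == "processor":
            # the processor's input is looked up by the id named in the XML
            registry[node.get("id")] = Processor(registry[node.get("input")])
    return registry

app = build(WIRING)
print(type(app["vwap"].input_feed).__name__)  # which feed got wired in was decided by the XML
```

Swap the `input` attribute in the XML and the application's data flow changes with no Python touched - which is exactly why such files deserve source control and tests like any other code.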

So don't bother saying XML is wonderful... it's OK. It's not the only option, and it's not even the best implementation of a self-describing markup language. But it has its uses. Unfortunately, I've seen far more abuses than valid uses of XML. I'm beginning to think the valid uses simply don't exist.

Venting Can Be Nice – Unless You’re the Target

Friday, December 7th, 2007

I was just talking to a co-worker about something totally unrelated - I think it was the history of a project here at the Shop and who's done work on it over the years... when it's changed platforms, who's done what, etc. Sort of fun because one of the guys new to the group hadn't heard this, and it had come up recently as needing some work done to it. While we were talking about this, my co-worker brought up something that had happened years ago, and didn't paint me in a very flattering light.

He had asked for a feature to be added to something I had built on top of a commercial package to make using this package a lot easier. Anyway, this one feature he wanted added wasn't natively supported in the commercial offering. I'm sure at the time I was trying to make it easy to do those things that the package did natively, and not build in a lot of functionality that wasn't part of the original concept of the package.

Well... at the time, he didn't buy this, dug into the code and saw that it could be done - not supported natively, but possible. He tells me now, he knew then that I was bs-ing (trying to be family-safe here) him and wanted to call me on it. He went to our boss, told him he knew I was bs-ing him and asked if they should push me on it. Boss said "No, let it go", and it dropped.

Later, when someone wanted to move an application to the web using this package I added this exact feature - not because anyone directly asked, but because if we were going to migrate this one app and kill it, we needed to have this functionality somewhere else. So I did it. It was more of an add-on than I'd done in the past, and meant that the use of this feature made the code much less efficient, but it did work.

And for this, he's been carrying this around on his chest for years. He got the chance today to get it off his chest and put it squarely on mine. So, was I bs-ing?

Looking back, I know I didn't think I was - then. Now I'm not so sure what my motives were. Maybe it was as I remember it - an attempt to be a thin shell on the package to make it easier to use the native features. Maybe I just didn't want to do it, so I never looked for a way to do it at the time.

It makes me think about what I'm doing today. I know I've turned down requests to put certain things in my code because I felt they didn't belong there. I've suggested where they might go, and encouraged others to put them there. I've even offered to let them fork my work and add what they wanted to the fork they made. But I will stand firm when I think the idea is not right. That's just good design, and an appreciation that not everything belongs in, say, a String class. You create horrible maintenance problems if you keep throwing things onto a cleanly designed set of classes.

Still... it causes me to pause and think. Maybe I was too harsh. Maybe I need to give it more time. I'd like to think I'm a better person because of this, but it still feels like I just got called to the principal's office.

Falling into a Great Position

Friday, December 7th, 2007


Nothing is perfect. Period.

Sometimes we're lucky, and sometimes we make our own luck. Sometimes.

I was just sitting here thinking how much I really enjoy the position I have today. I really do enjoy this industry - not that I'd ever have thought I would, from the outside looking in. But once in (thanks to a good friend), I have found it to be one of really interesting people, very challenging problems, and more than enough variation to keep me interested for more than a decade now. Pretty lucky, I'd say.

It's funny to me to see how the most captivating parts of this job are the ones that take me back to the work I did in grad school - numbers. Data. Lots of it. The market data is very interesting to me. Getting it from the provider, making it faster, better, easier to use - that's a blast. Getting it into a simulation/analytic engine is also a ton of fun. Seeing all the numbers roll around from the market data all the way through the various systems to show up finally at the client is a lot of fun. Number plumber - that's me. Kick in the pants.

After all the work I've done that has nothing to do with my degree, it's neat to realize that I really have come back to my 'roots'. I guess you gravitate to those things you really love doing. Sweet.

Solid Progress on the Data Source Migration

Thursday, December 6th, 2007

Today I spent a lot of time migrating off the data source we're trying to leave, and onto the one that has historical pricing data we can use without the rules and regulations attached to the original source. That meant doing a lot of work in the market data server to make this other data source look as close to the original as possible, while still taking advantage of the few things it does better than the original.

One of the more interesting things is having the server respect the field names used by the client in its requests. If a user is using the Perl interface, for instance, and they ask for 'close' or 'Close', they don't want to get data back with the field name 'CLOSE'. That makes it hard for them to match up what they sent with what they received.

True, we could require them to ask for 'CLOSE', and then all would be OK, but that's a little too restrictive for me. So I had to go into the server and fix up the way this data provider was handling field names. Then there were the changes to the cache that holds the returned data - again, 'Close', 'close', and 'CLOSE' should all cache to the same values, so we had to make that code case-insensitive as well.
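The shape of that fix can be sketched in a few lines. This is not the server's actual code - the class and function names here are mine, invented for illustration - but it shows the two behaviors together: the cache canonicalizes field names so all spellings share one entry, while the response is keyed by the caller's own spelling.

```python
# Illustrative sketch: case-insensitive cache, case-preserving responses.
class FieldCache:
    def __init__(self):
        self._data = {}  # canonical (upper-cased) field name -> value

    def put(self, field, value):
        # 'Close', 'close', and 'CLOSE' all land on the same entry
        self._data[field.upper()] = value

    def get(self, field):
        return self._data.get(field.upper())

def respond(cache, requested_fields):
    """Return results keyed by the caller's exact spelling of each field."""
    return {f: cache.get(f) for f in requested_fields}

cache = FieldCache()
cache.put("CLOSE", 101.25)
print(respond(cache, ["close", "Close"]))  # {'close': 101.25, 'Close': 101.25}
```

The point of the split is that canonicalization happens only at the cache boundary, so the client-facing layer never has to know or care which spelling the provider used.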

In the end, I got all the pieces working and was able to move two more things off the old provider and onto Fusion. It's now up to all the other users of the old data provider to change their code and migrate from the old to the new. It's not going to make a lot of people happy, but then again, there's no way they were going to pay $1 million to get this data. No way at all.

The Seemingly Ever-Changing Views of Management

Wednesday, December 5th, 2007


In the recent weeks one of our data vendors has contacted us about our use of their API. Specifically, they created this API for themselves, as their product runs on it, but they wanted to make it possible for users of their product to also get at this data in a programmatic manner. It was a nice thought. We looked at this and decided that it would be even nicer if the general users of this data didn't have to individually write to the API. Maybe even add a cache to the data received so that for each day, a single piece of data would only be fetched from the provider once - then after that, subsequent requests would hit the local cache. It's an excellent idea, a wonderful product, and now they are interested in charging us more than $1,000,000/year because of the data we get from this.
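The caching idea described above is simple to sketch. Everything below is hypothetical (the class, the stand-in `fetch_from_vendor`, and its return format are mine, not the vendor's API): each (date, field) pair is fetched from the provider at most once, and every later request for the same pair is served from the local cache.

```python
# Illustrative sketch of a fetch-once-per-day cache in front of a vendor API.
import datetime

class DailyCache:
    def __init__(self, fetch):
        self._fetch = fetch   # callable hitting the (expensive) vendor
        self._store = {}      # (date, field) -> cached value
        self.fetch_count = 0  # how many times the vendor was actually hit

    def get(self, date, field):
        key = (date, field)
        if key not in self._store:
            self._store[key] = self._fetch(date, field)
            self.fetch_count += 1
        return self._store[key]

def fetch_from_vendor(date, field):
    # stand-in for the real vendor API call
    return f"{field}@{date}"

cache = DailyCache(fetch_from_vendor)
today = datetime.date(2007, 12, 5)
cache.get(today, "CLOSE")
cache.get(today, "CLOSE")  # second request served locally, no vendor hit
print(cache.fetch_count)   # 1
```

It's exactly this pattern - one programmatic consumer fanning data out to many local users - that turned a convenience feature into something the vendor now wants to price separately.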

The reason for this is that this vendor decided to un-bundle the API from the product they sell that uses this API. They decided this because, of course, people were using it and they saw it as an additional revenue stream for their company. I can't blame them - it's capitalism at its best - make something someone wants and then charge for it... if they pay, then you have a winner; if not, then you are out the work to make the product. So they came to us and gave us a pricing model for the data we're getting, and it averages out to a little over a million dollars a year.

Given that the bundled product we got originally is about $2,700/month - this represents a large increase, and it's likely that the business will simply not see that this data is worth seven-figures a year. Again, that's their job - cost/benefit analysis... it's a necessary part of all businesses.

The problem comes in that it's not this easy - the vendor isn't just offering us an upgrade path, they are also (at the same time), holding a gun to our heads. If any of the trading business requires this data to function (as some do), then turning it off is simply not an option. Migration is the key, but migration to what? We'll have to look into the different competitors of data and look to see what each has and the cost, etc. Then work that into the systems one by one and then we'll be ready to turn off the high-priced original vendor. It's more than an afternoon of work.

The other side of the problem is that the management here has known about this issue for years. Yes, years. Each time they talk down the vendor from the ledge, and they agree to let us continue to use the data under the existing contract. When we start to get too much data from them, we find alternatives and migrate a few applications from that one source to another under controlled conditions and hitting the easiest, biggest offenders first. Makes a lot of sense. Usage of the data drops, they are happy, and life goes on.

But not now. No, they are convinced that the vendor is out to get the full price and lock down all the data. The problem with the lock-down stance is that there's no way to really do it unless we shut down all usage of the data. Imagine, I use the data in a spreadsheet. I save that spreadsheet on a network drive that others look at. That's redistribution of the data and they won't allow that. The only people that can see the data are the people they license to see the data - no exceptions. And if there's a possibility that someone else might see the data, then that has to be shut down. Their latest position is very restrictive, and to follow it to the letter will mean you really can't use the data for anything "saved" in any way, shape or form.

Clearly, we need a new source for data, but management is taking the attitude that we need to "spread out" the usage and then pay for what we really use - sticking to the letter of the license. But there won't be any data that we can use to the letter of the license as it's a server, and therefore might serve up data to something that possibly makes it available to someone not intended to see it.

It's dumb, but I'm not going to fight it. If they want to shut it down, fine. They aren't going to - not without a replacement, and when that replacement arrives we'll simply slot it into the server alongside the old, restrictive one. It's going to be a mess, but they seem to have given up on talking to the vendor on this. Maybe it's not possible, but I can't imagine a vendor that would take nothing over something - which is what they'll get if we shut this down. But then again, they could be trying to shut it down on their end, and this is their way of doing it. Vendors have been known to do dumber things.

Nobody Likes Making Mistakes

Wednesday, December 5th, 2007

Last night I had a release of the server, and I thought the shell scripts were included in the release package. In hindsight it's obvious what happened, but at the time I really thought they were included. Then, while walking to the train - with no hope of going back and fixing it without getting home late - I went ahead and got on the train. Then I got the call. Yup, my suspicion was right - the deployment package did not include the shell scripts, and there was a critical path change in this release. Crud!

It was easy enough to fix when I got home, but the problem was that the opening greeks were going to be missing because of the failures in the calculation nodes. So I had to reload all the underlyings with positioned options. This took about 40 mins, but could be done without having to take the server down. Since the Hong Kong day was already in full swing this was the far better idea.

As I was doing this by hand, I was thinking that it would be nice to have a little program that would do this for me. Well, this morning when I came in I started doing a little digging and sure enough, I had written a tool to do just this many moons ago. I had just forgotten about it until I had the time to dig into the possibility. Next time, I'll remember.

My point is that while I know it just makes me look more human (so says my wife), it really is terribly embarrassing to make mistakes like this. Sure, it was easily fixed, and I fixed it, but the fact that I made the mistake at all is what bugs me. Deep down, I knew there was a problem, because it came to me in a flash on the way to the train. Something in me was trying to tell me what I'd done, but I was sure it was going to be OK.

Wishful thinking.

I need to lighten up a bit. Everyone makes mistakes, and I have no problem forgiving others - I just have a hard time forgiving myself. I remember a scene from My Favorite Year when Swann (Peter O'Toole) was yelling at Benjy (Mark Linn-Baker) about going on TV live - "I'm not an actor! I'm a movie star!" Benjy gets angry at Swann for starting to walk out and says something about how Swann has always been his hero, and heroes never walk out. Swann is upset with the responsibility this places on him, as he really likes Benjy, and reacts badly by saying something to the effect of "I'm not that person! I'm just a person." Benjy responds with the best lines in the show: "I can't use my Alan Swanns life-sized. I need them as big as I can get them." And then the clincher: "Oh... and by the way, no one is that good an actor." Swann comes around and saves the day, in the end seeing that the person Benjy sees in him really is there, if he just believes in himself a bit more.

I feel I need to be better and not make those mistakes that are so easy to make. I don't want to be just ordinary. Man... the baggage we carry around from our childhoods. I've certainly got my share, and it comes out when I make mistakes like this. Double-Crud!

Getting The Curse on the Run

Tuesday, December 4th, 2007


Things are looking a lot better today with regards to the horrible application that I was working with again today. This is the one with empty tables, interesting method names, and nothing in general telling us what's really going on. Today I was able to successfully track the problem down to an ancient data access layer by proving that this application requested the instrument data but never got anything back from the service. Normally this would be a simple matter of looking at how the data gets loaded from the appropriate database, but no... that would be far too easy.

I had to find the place where the query for this service was being built, and then, in an entirely different directory (library) of this app, where the data was being read in and processed. It's amazing that anyone ever actually understood what was going on here. Maybe there were code generation tools in the initial versions, but there aren't any now, and it's an amazing lump of horrible code to dig through just to find out something as simple as what was asked for and what was returned.

In the end, I'm confident that it's a data problem in the database that's used to return the information for these calls. What exactly is wrong, I have no idea, but there's a group that does this all day long, and while they don't necessarily have any better idea about this than I do, at least they have time to invest in getting it tracked down. I need to be doing other things, and not spending more time on tracking down a data problem.

New Coda Released

Tuesday, December 4th, 2007


While I don't do a ton of web coding, I have to say that when I do it, it's nice to have Coda around, and just yesterday they released an update to v1.1 - the move to Leopard. The changes seem to be across the board - GUI changes to match the new UI of Leopard... engine changes for a lot of the things like CSS, etc. ... fixed a few bugs, added a few features. Not bad.

Every time I use Coda, I keep wishing there was more web coding to do. Fact is, it's a joy to use and I just like using it. I can't imagine a better recommendation than that. It works, and it works well. It looks great too. Super.

The Curse Arises Once Again

Monday, December 3rd, 2007


Today I spent most of my day dealing with a horrible system that has been folded, spindled, and mutilated far more than anything I've ever seen. This code has methods like:

    void klugeToGetAroundAnotherFatAssBug( ... );
    void klugeToGetAroundAnotherFatAssBug_node( ... );

and I'm not kidding, either. The old coders of this mess didn't think enough of the even older coders of this mess to write decent methods and just fix things. No, they had to make it personal. This is what I'm walking into. I really hate it.

Anyway, today I had to try and track down a data problem that was affecting just one instrument in the application. I looked in the tables in one of the databases it uses, and found the table the data should be in. Nope... it's empty. Nice try. But then why keep the table there? Just to mess with people's heads?

Then it hit me - it's The Curse. Good, decent programmers have to work on this code and it is such an incredible burden to them that they go insane and go work in flower shops, or pump gas for a living. It's painful, but it's real.

I want to get this solved, and then document a few things about this mess and then move on. It's not going to be easy, and it's certainly no fun, but it's what I do.