Archive for the ‘Coding’ Category

Once Again, Work Stress Kills my Posts

Tuesday, November 15th, 2011


This morning I noticed that once again, work, not laziness, has killed my posting here. I really hate that. One of the real joys of what I do is being able to write a little bit about it every day, but the way things have been going at The Shop, that's been nearly impossible these last few weeks. It's been non-stop work on finishing up the Greek Engine project, and dealing with the fact that, ultimately, we can't have the kind of system we'd like because the people maintaining the instrument data just don't seem to have it together.

I'm trying to believe that they are doing their best, but it's getting increasingly hard to do. I don't want to think badly of people, but when I come in to find a database with five instruments in it - not the 400,000+ I was expecting - it's really hard to think that they did any testing at all on the data. I know it's a matter of expectations and abilities, and for the longest time the expectations have been crazy low, but at some point you can't fall back on that and have to take some level of responsibility for your actions.

I can write the greatest code in the world, but what my users are going to remember is that the data coming out of it was horrible, and therefore my app was horrible, all because of bad data. There's no way around it. The Team depends on every single person doing their job. There just are no unimportant roles.

It's just heartbreaking to me to see this. Having put in the work I have, it's awful to see the complete and total lack of personal responsibility for the quality of the data I'm getting. It really is just heartbreaking.

But then, to add insult to injury, I'm unable to write about any of it. Unable to vent about it. I've tried and tried to make this a priority, but in the end, I know myself. When there is code to write and I have time before the train, I'm going to write it. That's my work ethic, and there's very little I can do about it - even if I wanted to. Which I don't.

I'm going to have to try harder. I'm afraid that, in the end, this is just the wrong job. Maybe it's a matter of timing. Maybe it's more fundamental than that - I don't know. I'll give it all the time I can to turn around, and I know there are people here really trying to turn this around. But if things don't change, I know there will come a point where I simply have to disconnect myself from this place in order to save myself.

I hope I can hold out.

Google Chrome dev 17.0.938.0 is Out

Tuesday, November 15th, 2011

Google Chrome

This morning I noticed that once again, Google Chrome dev was updated - this time to 17.0.938.0 - and it looks like the big change for me is the new V8 Javascript engine, 3.7.6.0, which includes the new garbage collector. The release notes post indicates that downloads might be broken for some, but thankfully, that's not the main usage I have for Chrome, so I'm safe - for now. I'm happy to see Google keep moving forward on Chrome - it's about the only thing I'm positive about when it comes to Google. The engineers aren't running the show anymore, and it pains me to see the "Do no evil" corporation do such horribly bad things.

Sigh.

But at least Chrome is going well.

Cabel’s Amazing FancyZoom 1.1

Monday, November 7th, 2011

I remember, years ago, Cabel S. writing about the nice little Javascript and graphics snippet called FancyZoom that allows you to easily put thumbnail images on your web site and then, with a click, display the full-size image. It's really pretty amazing. I downloaded it and started messing with it a bit. I can understand the directory structure he chose - it's common for him - but for me it was a little different, and that's fine with me. I was able to easily move things around a bit, get the links right, and it fit right into Liza's home page.

I remember it being a lot smoother back when it first came out, but now that machines and browsers are so much faster, it almost flashes up. Pretty neat, in my book. I need to run it past Liza and see what she thinks, but I'm pretty pleased with it. What I need now is a lot of Liza's artwork; then we can get to organizing it in folders, making thumbnails of the images, and we'll be ready to go.

I'm not sure if we'll end up needing to back-end this with a real service, but if we do, there's more than enough capability in PHP, should we need it, and I can even put up a little database to hold the information for the pages.

Pretty sweet.

Google Chrome dev 17.0.928.0 is Out

Friday, November 4th, 2011

Google Chrome

As expected, Google Chrome dev 17.0.928.0 is out, and with it come a few big-ticket items: the new V8 Javascript engine is at 3.6.6.3, there are additions to the incognito windows, and several changes to the Linux version to speed it up. Nice work. I read in the comments to the last release that Google is trying to make the release cycle six weeks. It's something to strive for, but I think that's a little arbitrary, as there's no way to know what the next six weeks will bring and how that will impact the ability to release significant changes to the code.

More likely, it's an arbitrary scale to just "change the numbers" and the third number, 928, is really the number to watch. Still, it's nice they are trying to have something out regularly. It's a nice goal to have.

Google Chrome dev 16.0.912.21 is Out

Wednesday, November 2nd, 2011

This morning I noticed that Google Chrome dev 16.0.912.21 was out, and while I'm still expecting 17.x any day now, it's nice to see that they are still looking into the finer points of the codebase. The release notes for this version are pretty tame - nothing amazing, but it's progress nonetheless.

Documenting the Greek Engine – OmniGraffle Pro

Monday, October 31st, 2011

I have started documenting some of the parts of my Greek Engine, beginning with our use of redis as a cache service. This documentation is written for our QA Testers, but it ultimately needs to be translated into non-technical speak for the operations and testing folks that will come along later in the lifecycle of the project. As an additional visual aid, I started with just about the only thing I had that would make Visio-like drawings: ZeusDraw. It's OK, and I've certainly done plenty in it, but it's not the same as a real layout program like Visio on Windows, or even OmniGraffle from OmniGroup.

Still… at $200 for the Pro version - the one that can read/write Visio files - it's a lot for something I don't need all that often. Then again, it's a sweet-looking application. Something I would really like to have, but it's hard to justify the cost for this one project.

I finished the first version of the block diagram of the Workspace and data feeds and realized that I needed to spend the money if I really wanted to make these look nice and useful. So it looks like I'm going to redo them in OmniGraffle Pro as soon as we load up my PayPal account with enough money to cover the $200 the app costs on the Mac App Store.

Creating a Trade Message Accumulator (cont.)

Friday, October 28th, 2011

High-Tech Greek Engine

Today I finished up the trade message accumulator component for the Greek Engine and put it into play on the feeds from the exchanges. It's something that's been needed, and I'm hoping it's going to be able to accurately track the volume on each exchange for each instrument, but I'm not really certain that the volume numbers are all that reliable. I've seen a lot of pretty wild data from the exchanges, and I know the prices are far more important than the sizes; I've seen enough problems in the prices to suspect that even if the volumes are accurately tracked and accumulated, they probably aren't going to yield a really superior data stream.

It should be a lot closer, but I'm not willing to bet anything on these numbers - at least not yet.

Creating a Trade Message Accumulator

Thursday, October 27th, 2011


I've run into a problem with my Ticker Plant feeds, and the best solution is to create a component that can fit in-line with the feeds and "promote" and "embellish" the trade messages (prints) with the cumulative volume traded as well as the per-exchange volumes. The issue is that if I allow these trade messages to conflate, information about the individual trade will be lost. Sure, the most recent data will survive, but the volume change represented by the conflated trade will be gone forever.

What I needed was a very simple, very lightweight component that would take the stream of trades and aggregate the volumes - as well as the high, low, and open values - and then "attach" those values to a new trade message: one that is a subclass of the original, and so can take its place in all the processing, but also includes the cumulative volume and limit values.
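
Roughly, the shape of the thing looks something like this. It's a much-simplified sketch - the names are made up for illustration and aren't the actual classes in the engine - but it shows the subclass-plus-accumulator idea:

```cpp
#include <stdint.h>
#include <map>
#include <string>

// A bare-bones trade message - a stand-in for the real feed message class.
struct TradeMessage {
    std::string  instrument;   // instrument identifier
    std::string  exchange;     // exchange the print came from
    double       price;        // trade price
    uint32_t     size;         // trade size
    TradeMessage() : price(0.0), size(0) {}
    virtual ~TradeMessage() {}
};

// The "embellished" trade - a subclass of the original, so it can take the
// original's place in all the downstream processing, but it also carries
// the accumulated volume and limit values.
struct CumulativeTradeMessage : public TradeMessage {
    uint64_t                         totalVolume;     // volume across all exchanges
    std::map<std::string, uint64_t>  exchangeVolume;  // per-exchange volume
    double                           open;            // first price seen
    double                           high;            // highest price seen
    double                           low;             // lowest price seen
    CumulativeTradeMessage() : totalVolume(0), open(0), high(0), low(0) {}
};

// The accumulator sits in-line on the feed: each print is folded into the
// running totals for its instrument, and an enriched copy is handed back.
class TradeAccumulator {
public:
    CumulativeTradeMessage accumulate(const TradeMessage & trade) {
        State & s = mState[trade.instrument];
        if (s.trades == 0) {
            s.open = s.high = s.low = trade.price;
        } else {
            if (trade.price > s.high) s.high = trade.price;
            if (trade.price < s.low)  s.low  = trade.price;
        }
        s.trades += 1;
        s.totalVolume += trade.size;
        s.exchangeVolume[trade.exchange] += trade.size;

        CumulativeTradeMessage  out;
        static_cast<TradeMessage &>(out) = trade;   // copy the original fields
        out.totalVolume    = s.totalVolume;
        out.exchangeVolume = s.exchangeVolume;
        out.open = s.open;
        out.high = s.high;
        out.low  = s.low;
        return out;
    }

private:
    // per-instrument running state
    struct State {
        uint64_t                         trades;
        uint64_t                         totalVolume;
        std::map<std::string, uint64_t>  exchangeVolume;
        double                           open, high, low;
        State() : trades(0), totalVolume(0), open(0), high(0), low(0) {}
    };
    std::map<std::string, State>  mState;
};
```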

Today has been spent creating this and fixing issues associated with it. There are a lot of little things to write, including the serialization and deserialization of the component, the handling of this new message type, and automated tests to make sure I'm accumulating things properly. It's close, and I just need to write a few more tests, but for now the day is over and I need a break.

Google Chrome dev 16.0.912.12 is Out

Wednesday, October 26th, 2011

Google Chrome

This morning Google Chrome dev 16.0.912.12 was released, and I picked it up. When I went to the release notes site, I saw that Google Chrome 15.0.874.102 was released to stable, which just blows me away. That means that stable and beta are both 15.x.x.x and dev is on 16.x.x.x - and that's not going to stay that way for long. I'm guessing dev is jumping to 17.x.x.x pretty soon. Additionally, there were no release notes for 16.0.912.12 at the time the code was available. So maybe it's coming sooner rather than later.

Always interesting times.

[10/28] UPDATE: don't blink - they just released 16.0.912.15 with typically sparse release notes. That's less than a day after the last release. Yup… it's about to jump to 17.* soon…

Creating a Solid, Reliable C++ Wrapper for hiredis Library

Tuesday, October 25th, 2011

Redis Database

Most of today has been spent trying to get my simple C++ wrapper around the hiredis C library for redis working in a way that allows for a significantly more robust usage pattern than I originally had. Specifically, all was fine until I shut off the redis server, and then my client would try to recover and reconnect and end up dumping core. The problems were only made worse by the fact that there are really no support docs on the hiredis site - only the source code, which is optimistic in the extreme. There are no argument checks, which makes it ripe for problems if it's not used exactly right.

Clearly, I wasn't using it exactly right, and those misusage patterns were what was causing the core dumps. So the first thing was to track down what I was doing wrong, and that meant I really needed to become much more familiar with the hiredis source code. To be fair, it's a decent open source library, but it's missing so much that would have added so little to the runtime load and would have made it far more robust to the kinds of misusage patterns I had in place. After all, my code worked, so it's not that it was totally wrong; it's just that when things start to go badly, the things you need to do become far more important than when things are going well.

For example, if you want to send multiple commands to the redis server at once, you can run several redisAppendCommand() calls, but each really needs to be checked for its return value. This isn't clear in the code, but it's very important in the actual system. Then there are the calls to redisGetReply() - typically one for each call to redisAppendCommand() - but not always. Again, you need to check for the critical REDIS_ERR_IO error that indicates that the redis context (connection object) is now so far gone that it has to be abandoned.
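
To make that concrete, here's roughly the pattern I ended up with for the pipelined case. It's boiled way down from my wrapper - the function and the logging are just a sketch - but the hiredis calls and the checks on them are the point:

```cpp
#include <hiredis/hiredis.h>
#include <stdio.h>

// Pipeline two commands and drain the replies, checking every return code.
// Boiled down from the wrapper - on any failure the caller is expected to
// throw the context away and reconnect.
bool setWithExpiry(redisContext *ctx, const char *key, const char *value, int ttl)
{
    if (ctx == NULL || ctx->err) {
        return false;                      // never hand hiredis a bad context
    }
    // queue up the commands - EACH append has a return value worth checking
    if (redisAppendCommand(ctx, "SET %s %s", key, value) != REDIS_OK ||
        redisAppendCommand(ctx, "EXPIRE %s %d", key, ttl) != REDIS_OK) {
        fprintf(stderr, "append failed: %s\n", ctx->errstr);
        return false;
    }
    // one reply per appended command - and watch for REDIS_ERR_IO, which
    // means the context is so far gone it has to be abandoned
    for (int i = 0; i < 2; ++i) {
        void  *reply = NULL;
        if (redisGetReply(ctx, &reply) != REDIS_OK) {
            fprintf(stderr, "getReply failed (err=%d): %s\n", ctx->err, ctx->errstr);
            return false;
        }
        freeReplyObject(reply);
    }
    return true;
}
```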

Then there's the reconnection logic. It's not horrible, but you have to be careful that you don't pass in any NULLs. There simply is no checking in the hiredis code to ensure that NULL arguments are caught. It would be simple to do, but it's not there - not at all.
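
The reconnect itself boils down to something like this - again, just a simplified sketch of what ended up in my wrapper, but it shows the point: every pointer gets guarded on my side because hiredis isn't going to do it for me:

```cpp
#include <hiredis/hiredis.h>
#include <stdio.h>

// Throw away a dead context and build a fresh one, guarding every pointer
// ourselves since hiredis won't.
redisContext *reconnect(redisContext *old, const char *host, int port)
{
    if (old != NULL) {
        redisFree(old);                    // never pass NULL to redisFree()
    }
    redisContext  *ctx = redisConnect(host, port);
    if (ctx == NULL) {
        fprintf(stderr, "redisConnect could not allocate a context\n");
        return NULL;
    }
    if (ctx->err) {
        fprintf(stderr, "reconnect to %s:%d failed: %s\n", host, port, ctx->errstr);
        redisFree(ctx);                    // a context with err set is useless
        return NULL;
    }
    return ctx;                            // healthy, usable context
}
```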

In the end, I got something working, but it took hours of code dissection and gdb work to figure out what was going wrong and what needed to be done to handle the disconnected server and then the proper reconnection. Not fun, and several times I wondered if it wouldn't just be easier to write my own, as it's all TCP/telnet-based anyway… but I kept going, and now I have something that's reliable and solid. But it was nasty to get here.