Archive for the ‘Coding’ Category

Moving Work Development Boxes

Friday, November 6th, 2009

Today I finally had time to finish my move to the new development 'server' box in the server room. It's a little 4-core 1U box, but it's the "standard" development server, so be it. I needed access to more memory than my desktop could hold, and this guy is capable of holding 192GB - which is nice, but I'll start with 32GB and see how it goes from there.

Since I have no root access, I have to check a few things, ask for a few things to be changed, and repeat until everything is set up as I need it. Not exactly hard, but it's detail work that needs to be done right so that, should I need to build another box, they'll be able to use this one as a template and stamp them out quite nicely.

After I got done with the move, I made a few much-needed GUI changes to my web app. The fixed calendar was just taking up too much room, and it had to go. Thankfully, it's easily made into a pop-up calendar, with an INPUT tag associated with the value. It took a little messing with divs, but I got something that looks pretty nice, and now there's a lot more room on the page for new GUI features. Not bad.

Some Days It Seems All Your Work is for Nothing

Thursday, November 5th, 2009

Today has been a really hard day. I've worked all day trying to get the speed of one process that hits a legacy Windows app up enough that the users don't ask for features to be cut to get the speed where they need it. I totally understand why they need the speed. I also know what's causing it to be slower - it's the additional features - but there has to be a balance between the two: for some people the new features are the more important part, and for others it's the speed.

So I'm trying to get speed and features, but today was very disheartening.

For ten full hours I tried to get the speed up. I tried everything I could think of, then thought up new ideas and tried those too. I was sure I could find the speed - but I couldn't. I ended the day with the same cycle times I started with. Sure, they were a lot better than the production values, but that was hardware: run the process on faster hardware and you get faster times.

Obvious, but it's all I had.

That's what we'll run with until we get even faster hardware for this guy. It's not really rocket science, but it's more than a little frustrating. But like so much in life, there's nothing I can do about it past what I've already done.

Lots More Performance Metrics – Very Little Gain

Wednesday, November 4th, 2009

Today I spent a lot of time trying to figure out what the problem was with the hardware for this little feeder app to a larger webapp. I started with a desktop box and got it set up and running. The times were nice, but still about twice what I was getting on my desktop. So that spun off a big investigation into why seemingly identical boxes are off by a factor of two.

After working on that for a bit, I was given a Windows Server box to check. It was fine, but the times were slower - primarily due to slower processors in the server. Still, it was nice to see that I was able to run the application on Windows Server as opposed to just XP. But in the end it was "nice, but no good" - I was still a long way from the marks I'd taken on my desktop.

When I went back to the desktop to try to account for the factor of 2, I noticed its network card was running at 1/10th the speed of my desktop's, so I called in the network support guys to ask to have it switched up to 1Gbps. Turns out, they can't start until 4:15 pm. Yum.

I wish this was more clear-cut. I have something, but not enough. Have to keep at it tomorrow.

BBEdit 9.3 is Out

Wednesday, November 4th, 2009

Today I got a tweet about a new release of BBEdit being out - 9.3. The release notes are extensive and show this to be a significant update to the system. Wow. Pretty impressive.

I'm not sure about a lot of the new stuff, but the "maketags" argument is great as it means I don't have to worry about making my own ctags file. Very nice of them.

Chasing Elusive Performance Bottlenecks

Tuesday, November 3rd, 2009

Today I've spent a lot of time trying to find performance where there was no guarantee it could be found. I had a strong feeling there had to be a better way, but I had no proof. I have an application that needs to talk to a legacy system through a DataSource connector (yeah, Windows) to get data out and into the app I inherited. Doesn't sound too hard. Yeah, right.

Well, this is complicated by the fact that there are a few ways to get at this data, and as I looked at the different access schemes I kept thinking, "If I could get the speed Excel has in getting the data, I'd be OK." So it became a quest to figure out how Excel did it, and to match its speed.

I started with the existing code - using an OleDbAdapter. This was taking a horribly long 130 sec. to get data that Excel could return on my desktop box in about 12 sec. That factor of ten was killing me. I needed to run it on the VMs, but hey, they're supposed to be virtual machines that match the performance of my desktop - right?

So I tried an ODBC connector. That didn't get me anywhere, because the methods to populate the .NET DataTable weren't implemented in the ODBC driver. When I passed this back to the developer who had pointed me in this direction, he agreed that, yeah, this wouldn't work. Nice.

Then I tried a more "bare bones" approach - getting the data as a low-level array and then pulling that apart to build a DataTable out of it. When I got that one into test, I realized the time was even worse. Not the right direction to go.

So finally I started running all the tests on all the machines I had at my disposal: my desktop, my Test VM, and the Production VM. The results were shocking.

Machine     Excel Query    Code Query
Desktop     12 s           11 s
Test VM     20 s           75 s
Prod VM     26 s           130 s

So while I might see that the difference between a VM and a desktop is somewhat fixed for the Excel query, it's hardly fixed for the programmatic query.

Why the difference?

My best explanation is that VMs are horrible for anything that needs to hit a disk. The two VMs have different disk subsystems, and neither is really good. We are getting a new SAN, but that's still a few weeks away. The performance should even out once all the VMs are moved to the new SAN, but until then, we're stuck with what we have.

Additionally, the desktop shows consistent performance in either case. While this seems logical, it also points to the Excel query being much less memory-intensive - I can see it pulling the data in as a stream, while the code query has to get it all into a DataTable and then process it. At the table sizes we're looking at, I can see that making a big difference.
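
To illustrate the difference (this is a sketch of the general pattern, not the app's actual code): a streaming consumer can work row-by-row and never holds more than one row, while the table-style approach must materialize every row before any processing can start.

```javascript
// Streaming: hand back one row at a time; only one row is in hand at once.
function* streamRows(source) {
  for (const row of source) {
    yield row;
  }
}

// Buffered: build the whole table in memory before any processing starts.
function loadTable(source) {
  const table = [];
  for (const row of source) {
    table.push(row);
  }
  return table;
}

// Either way the answer is the same; the memory profile is not.
function sumStreamed(source) {
  let total = 0;
  for (const row of streamRows(source)) {
    total += row.value;
  }
  return total;
}
```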

What I'm going to try tomorrow is to get a plain desktop and set it up as the compute box to replace the production VM. This is not going to make me any friends in the support group - they hate having production apps on desktop boxes - but hey... this is an order-of-magnitude difference. We can try to get XP-compatible servers and replace the desktop with a server-room solution once we can get it in, racked, and powered. But for now, the users are screaming and I need to give them a solution.

Amazing how things work out.

The Ups and Downs of Programming

Monday, November 2nd, 2009

Today has been a real up-and-down day. It started out very nicely: after a day off helping friends in Indy, I came back in to find that the work I'd done last week on the persistent state of the alerts was really rather good. I had a few typos in the code, and one logical mistake of not returning the value from a map look-up, but that was it. I was really very happy with the way it was working out.
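
For the record, that logical mistake had the classic shape of doing the map look-up but never returning its result (the function and map names here are illustrative, not the actual code):

```javascript
// Buggy shape: the look-up happens, but nothing is returned,
// so callers always see undefined.
function alertState(alerts, id) {
  alerts[id];
}

// Fixed: return the looked-up value.
function alertStateFixed(alerts, id) {
  return alerts[id];
}
```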

But that happiness didn't last long.

I got an upset call from one of the bigger users; they were very unhappy with the performance of the updates from the system I inherited. There wasn't a lot I could think to do about the issue, but I had to dig in and give it a go.

What ensued was a day-long realization that not a single soul had a solid understanding of several key components of the data-access code I was forced to use. It was frustrating, to say the least, and because I made very little headway today, I know I'm not going to be out from under it tomorrow.

Yeah, it went from great to crappy in no-time flat.

Development Isn’t Always Glamorous Work

Wednesday, October 28th, 2009

Today I was doing a lot of little things to my web app. Nothing really glamorous, but in the end they added a significant bit of functionality to the alerting system. I added thresholds for each desk so that they can be used as variables in the alerts, which means we can make the system a lot more proactive than it is now when desks get near their limits. Adding the variables to the code was easy, and not really remarkable in and of itself, but the larger effect is pretty nice.
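
In spirit, the change looks something like this - a sketch only, with illustrative names, not the actual app code: the desk's threshold values get merged into the scope an alert condition is evaluated in, so a condition can reference them directly.

```javascript
// Hypothetical per-desk thresholds (names and values are illustrative).
var deskThresholds = { equities: { maxPosition: 500000 } };

// Evaluate an alert condition with the desk's thresholds in scope,
// so a condition can say e.g. "position > maxPosition * 0.9" to warn
// before the desk actually hits its limit. (Function-constructor
// evaluation is just for illustration here.)
function evaluateAlert(desk, condition, metrics) {
  var scope = Object.assign({}, deskThresholds[desk], metrics);
  var names = Object.keys(scope);
  var values = names.map(function (n) { return scope[n]; });
  return new Function(names.join(','), 'return (' + condition + ');')
    .apply(null, values);
}
```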

Anyway... lots of work and not a lot of interesting things to show for it.

Interesting Behavior of the Google Visualization Table

Tuesday, October 27th, 2009


I had to do some work today with the Google Visualization Table, and no matter how I set it up, if the table had more rows than would fit in the vertical space of the enclosing div, I'd get a vertical scrollbar and a horizontal one.

I kept fiddling with the width of the div and the width in the table's parameters, and nothing seemed to make any difference. Then I started to see a pattern: the horizontal scrollbar was only there when the vertical scrollbar was - and it always scrolled exactly the width of the vertical scrollbar.

It was as if the vertical scrollbar's width was not being taken into account in the sizing of the table, and therefore, a horizontal scrollbar was needed to display it all.

So how to fix it?

Not so easy, it seems.

It's a known bug by Google, but they have not yet released a fix. Nice.

I started playing around with the width parameter in the table's params:

  var tableParams = { showRowNumber: false,
                      allowHtml: true,
                      width: '100%',
                      cssClassName: tableStyles };

and when I used a percentage width I saw that the table's width calculations were done properly. So, by setting it to '100%', I got the maximum width, the vertical scrollbar when I needed it, and no horizontal scrollbar when I didn't.
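
Putting it together, the setup looks something like this (the container id and the loader boilerplate are illustrative; the one load-bearing detail is the percentage width):

```javascript
// Build the Table options; a percentage width lets the renderer account
// for the vertical scrollbar in its sizing, killing the phantom
// horizontal scrollbar.
function makeTableOptions(styles) {
  return {
    showRowNumber: false,
    allowHtml: true,
    width: '100%',        // percentage, not pixels - this is the fix
    cssClassName: styles
  };
}

// In the browser (div id is illustrative):
//   var table = new google.visualization.Table(document.getElementById('table_div'));
//   table.draw(dataTable, makeTableOptions(tableStyles));
```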

Wild, that I needed to set the width in that way.

Amazon RDS – MySQL in the Cloud

Tuesday, October 27th, 2009

This is an interesting development. The Amazon S3 system has been online for quite a while, and clients like Transmit and Cyberduck already talk to S3. But it's an object store - nice in many regards, but not exactly what someone building an application might be looking for. There are just way too many applications that need a general, reliable, relational database - like MySQL.

While I haven't used MySQL a lot, it's pretty much on par with the other open source databases - PostgreSQL and the like. It's a filesystem-based database where it's easy to back up by simply copying a few files. Nice, compact, and pretty good performance from what I've heard. I've used it in WordPress installs in the past, and it has always worked very well for me.

What is interesting is that Amazon seems to have broken down the costs so that you really are only paying for what you use - be it CPU cycles to run the queries, or I/O to get the data to/from your machines. It's a pretty wild idea. Couple this with the EC2 machines, where you can bring up a Linux box with your software on it, and you can run whatever you need against this MySQL database with all the data your app(s) require. They even have a reduced I/O rate for transfers between their own servers, so if you really can run the application on their end with only minimal outside interaction, you're liable to get away with a very inexpensive I/O bill.

Of course, this means that you're paying for the CPU cycles on their end... there is no free lunch. So it's an interesting option. If you need to have a cloud MySQL database, this is the only one I've seen. But the question becomes, do you need it? That's tougher to answer.

SubEthaEdit 3.5.1 is Out

Tuesday, October 27th, 2009

I was thinking about Coda and the editor it uses - SubEthaEdit. So I fired it up to see if they had released a new version of the base editor, and indeed they have. SubEthaEdit 3.5.1 has quite a list of features and fixes, including several Snow Leopard fixes. Fantastic news.