Chasing Elusive Performance Bottlenecks

Today I've spent a lot of time trying to find performance where there was no guarantee it could be found. I had a strong feeling that there had to be a better way, but no proof. I have this application that needs to talk to a legacy system through a DataSource connector (yeah, Windows) to get data out of it and into the app I inherited. Doesn't sound too hard. Yeah, right.

Well, this is complicated by the fact that there are a few ways to get at this data, and as I looked at the different access schemes I kept thinking "If I could get the speed Excel has in getting the data, I'd be OK". So it became a quest to figure out how Excel did it, and to match its speed.

I started with the existing code - using an OleDbAdapter. This was taking a horribly long 130 sec. to get data that Excel could return on my desktop box in about 12 sec. That factor of ten was killing me. I needed to run it on the VMs, but hey, they're supposed to be virtual machines that match the performance of my desktop - right?
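
For reference, the existing code was basically the textbook adapter-fill pattern. Here's a minimal sketch of what I mean - the provider, server, and query names below are made up, not the real legacy system:

    using System;
    using System.Data;
    using System.Data.OleDb;
    using System.Diagnostics;

    class OleDbTiming
    {
        static void Main()
        {
            // Placeholder connection string and query - not the real names.
            const string connStr = "Provider=SomeLegacyProvider;Data Source=LEGACY01;";
            const string sql = "SELECT * FROM BigLegacyTable";

            var table = new DataTable();
            var sw = Stopwatch.StartNew();

            using (var conn = new OleDbConnection(connStr))
            using (var adapter = new OleDbDataAdapter(sql, conn))
            {
                // Fill() runs the query and copies every row into the
                // in-memory DataTable before returning.
                adapter.Fill(table);
            }

            sw.Stop();
            Console.WriteLine("{0} rows in {1:F1} s", table.Rows.Count, sw.Elapsed.TotalSeconds);
        }
    }

The timing wrapper is just how I was measuring it; the interesting part is that single Fill() call doing all the work.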

So I tried an ODBC connector. That didn't get me anywhere because the methods needed to populate a .NET DataTable weren't implemented in the ODBC driver. After I passed this back to the developer who had pointed me in this direction, he agreed that, yeah, this wouldn't work. Nice.
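
The attempt itself was pretty much a mechanical swap of the OLE DB classes for their ODBC counterparts, something along these lines (the DSN and query are placeholders):

    using System.Data;
    using System.Data.Odbc;

    class OdbcAttempt
    {
        static DataTable FillViaOdbc()
        {
            // Placeholder DSN and query - not the real names.
            var table = new DataTable();
            using (var conn = new OdbcConnection("DSN=LegacyDsn;"))
            using (var adapter = new OdbcDataAdapter("SELECT * FROM BigLegacyTable", conn))
            {
                // With a complete driver this fills the table just like the
                // OLE DB version; against this legacy driver the calls the
                // adapter relies on weren't implemented, so the fill failed.
                adapter.Fill(table);
            }
            return table;
        }
    }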

Then I tried a more "bare bones" approach - getting the data as a low-level array and then pulling that apart to make a DataTable out of it. When I got that one into test, I realized that the time was even worse. Not the right direction to go in.
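
I don't have that exact code in front of me anymore, but the shape of it was roughly this: copy each record into a raw object[] first, then rebuild the DataTable from those arrays in a second pass (names here are made up):

    using System.Collections.Generic;
    using System.Data;
    using System.Data.OleDb;

    class RawArrayFill
    {
        static DataTable FillFromRawRows(string connStr, string sql)
        {
            var rawRows = new List<object[]>();
            string[] columnNames;

            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    columnNames = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        columnNames[i] = reader.GetName(i);

                    while (reader.Read())
                    {
                        // Copy the current row into a plain array.
                        var values = new object[reader.FieldCount];
                        reader.GetValues(values);
                        rawRows.Add(values);
                    }
                }
            }

            // Second pass: assemble the DataTable by hand.
            var table = new DataTable();
            foreach (var name in columnNames)
                table.Columns.Add(name);
            foreach (var row in rawRows)
                table.Rows.Add(row);
            return table;
        }
    }

In hindsight, the extra copy (driver to array, then array to DataTable) probably explains why it tested slower than the plain adapter fill.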

So finally I started running all the tests on all the machines I had at my disposal: my desktop, my Test VM, and the Production VM. The results were shocking.

Machine    Excel Query    Code Query
Desktop    12 s           11 s
Test VM    20 s           75 s
Prod VM    26 s           130 s

So while the difference between a VM and the desktop is a fairly modest, fixed factor for the Excel query, it's anything but fixed for the programmatic query.

Why the difference?

My best explanation is that VMs are horrible for anything that needs to hit the disk. The two VMs have different disk subsystems, and neither is really good. We are getting a new SAN, but that's still a few weeks away. Performance should even out once all the VMs are moved to the new SAN, but until then we're stuck with what we have.

Additionally, the desktop shows consistent performance in both cases. That points to the Excel query being much less memory-intensive: I can see it pulling the data in as a stream, while the code query has to load the whole result into a DataTable and then process it. With tables of the size we're dealing with, that could easily be a big difference.
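
If I ever rework the code query, the streaming equivalent is the obvious thing to try: handle one row at a time from a data reader instead of buffering everything in a DataTable first. A rough sketch, where the delegate is just a stand-in for whatever the per-row processing ends up being:

    using System;
    using System.Data.OleDb;

    class StreamingQuery
    {
        static void StreamRows(string connStr, string sql, Action<OleDbDataReader> handleRow)
        {
            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    // Only the current row is held in memory at any point,
                    // which is roughly how I picture Excel pulling the data.
                    while (reader.Read())
                        handleRow(reader);
                }
            }
        }
    }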

What I'm going to try tomorrow is to grab a plain desktop and set it up as the compute box to replace the production VM. This is not going to make me any friends in the support group, as they hate having production apps on desktop boxes, but hey... this is an order-of-magnitude difference. We can try to get XP-compatible servers and replace the desktop with a server-room solution once we can get one in, racked, and powered. But for now, the users are screaming and I need to give them a solution.

Amazing how things work out.