Archive for the ‘Cube Life’ Category

Added Investigative Tool to Metrics App

Wednesday, January 23rd, 2013


This morning, with the right help, I was able to put up a slick little javascript JSON viewer on the main 'metrics' web page for the project I'm on at The Shop. The goal is to give folks - both developers and support - a quick way to look at the details of the Demand objects in the demand service. Since each demand has a UUID, it's possible to follow the series of demand objects as a demand "flows" through the service and is adjusted, combined, and delivered to the client.

I have to say that I'm very impressed with the speed of the clojure service. Then again, it's running on the JVM, and there's not all that much to it - an indexed postgres table, a select statement, and then formatting. Sure, that's a simplification, but it's not like we're delivering streaming video or something. But wow! It's fast.
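On the service side, the lookup really is conceptually just a parameterized select on the UUID plus JSON encoding. A minimal sketch of that idea - assuming clojure.java.jdbc and cheshire, with a made-up db-spec, table, and column names, not the actual service code:

  (ns metrics.demand-lookup
    (:require [clojure.java.jdbc :as jdbc]   ; SQL access
              [cheshire.core :as json]))     ; JSON encoding

  ;; Hypothetical connection spec - the real service has its own configuration.
  (def db-spec {:subprotocol "postgresql"
                :subname     "//localhost:5432/demand"
                :user        "metrics"})

  (defn demand-history
    "Every stored version of a demand, oldest first, keyed on its UUID.
     Relies on an index on demands(uuid) to keep the lookup cheap."
    [uuid]
    (jdbc/query db-spec
                ["select * from demands where uuid = ? order by created_at" uuid]))

  (defn demand-history-json
    "Encode the history as JSON for the metrics page to render."
    [uuid]
    (json/generate-string (demand-history uuid)))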

And the regex-based JSON classifier in javascript is pretty impressive as well. It's fast and clean, and the CSS for the syntax highlighting is a lot of fun to play with. I can see how much fun this could be - and what a big time sink, if I played with it too much.

But it's nice to punch it out and make it available on the much improved unicorn/nginx platform. Those timeouts are gone, and the code is much simpler, and that's a wonderful thing.

Slick Little JSON Viewer

Wednesday, January 23rd, 2013


Late yesterday I found a slick little javascript-based JSON formatter and syntax highlighter. What I was looking for was a simple and easy way to get the JSON data out of the demand service and display it to the user. The point was to give the users a clean way to see the low-level data for a specific demand so that they can look into problems without having to go to a developer or support person.

The project manager said that we didn't need to have anything fancy - just display the JSON data for the object as it's reasonably self-documenting, and most of the folks could figure out what they needed from that. For the more serious use-cases, we're still going to need to have a more comprehensive dashboard, but for now, this would be something we could whip up pretty easily and solve 90% of the need.

The code was really amazingly simple:

I have the classifier and then the CSS for the individual classes. Then I just have to hit the server based on the demand ID and show the results. Very nice and clean.

Pretty stylish, too! I'm impressed.

Getting the Right Kind of Help Really Matters

Tuesday, January 22nd, 2013

I've been having issues with timeouts on a web app that displays certain metrics for the main project I'm on at The Shop. The set-up is a Sinatra app running under Unicorn, behind an nginx server that acts as a gateway for the service. I didn't set this up, so I can't really say why we have the nginx server in front of the unicorn server - the point of the unicorn server is to handle the load nicely, and it's all on one box, so we're not getting any redirection out of it - but for some reason we have this set-up, and we're getting these timeouts.

Initially, I tried to have an EventMachine timer that sent a NULL to the client every 10 sec. This appeared to work, but then we started having issues with the thread it ran in dying, so we had to put in a check-with-restart on each call. Things clearly weren't getting any simpler, and I decided to try converting everything from MRI to jruby, as we really didn't have any good answers as to how this was being deployed and configured. There were just too many holes.

So I reached out to one of the guys that was supposed to be the architect of this deployment scheme. I didn't know who it was, but when I found out I was really quite pleased - it was a guy that I've been interviewing with several times over the course of the last few months at The Shop.

He explained how the deployment was done, where the config and init.d scripts were generated, how to modify their generation, and, in short, how to get rid of the timeouts by simply bumping a few of the timeout settings in the unicorn and nginx servers. So I took his advice, stripped the EventMachine stuff out of the code, and gave it a whirl.

Worked like a champ.

What a difference getting the right help makes. Inside of about an hour I had all the information I needed and had a solution that was cleaner, easier, and more maintainable than what I started with. The process sounds simple: find the right help, get it, and solve the problem. But too many folks - including me, initially - don't do that.

Shame on me.

Great Languages Hurt by Horrible Communities

Tuesday, January 22nd, 2013


I was talking to a very good friend the other day, and we were chatting about what I thought of the Ruby world, and even the Clojure world. I had to say that what I've seen of the two languages - and jruby as an implementation - is very nice, but the communities that sprout up around these two languages are more destructive than anything I've seen in my 30+ years of development experience. The ruby community especially is almost toxic in its adoption of the "Lazy Coder". Don't get me wrong, I'm all for the Magic of the Gem, but there's a point where a library - be it Boost or a gem - has to expose what it's doing so that the user can see if it's worth using in their implementation.

What I see most often in the rubyists I work with is a complete blindness to what's really happening in the gems until they're tracking down a bug and trace it to one of them. Then they either abandon it for another gem and re-work their code, or they fork it, make a pull request, and carry on.

I think the latter is admirable, but far too little thought is put into the code - and therefore the gems - before a bug hunt is underway. In most cases these are simplistic web sites/services, so they don't focus on edge cases, and "production" means "usually up, and mostly working". Things typically work with these gems, so the true cost of these libraries stays completely unknown to the user.

If I were doing this for fun, that'd be OK. I'd live with it because it's only a "fun project", but when I'm getting paid for this work, I know that performance matters. Heck, everything matters - from the documentation to the runtimes, to the maintainability, to the ease of deployment. It all matters. So you can't pretend that dropping a gem into your app is going to do its magic for free. It's going to extract something for its service, and that something you need to know about.

But that's the rubyist way… to have "magic" gems. You drop this in, and by convention you have a named method/function and all the rest is done for you. I can really appreciate that - JavaBeans was all about that. But it's how it's taken and abused that makes me shake my head. Use it, but understand what it's really doing. Then you can know when it's no longer the appropriate tool for the job.

Anyway… this is never going to change. In fact, I'm guessing it's only going to get worse with time. I'm an old-timer now - a dinosaur who, to most, looks more interested in using stone knives and bear skins - but I'll tell you this: lack of real understanding is the true source of bugs.

Running Tests, Fixing Issues, Moving Forward

Monday, January 21st, 2013


Today I spent a lot of time with the new calculation chain reaction in the demand service, trying to make sure that everything was running as it should. There were a few issues with the updates when there wasn't an updating component - like a demand set without a seasonality set to adjust it. In those cases we did nothing but log an error, when the right thing to do was to realize that a seasonal adjustment without any factors is just a no-op, and to return the original data as the "adjusted" data. Easy.
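In clojure that no-op behavior is about as simple as it gets - if the lookup for the adjusting factors comes back empty, just hand back the original demand untouched. A rough sketch of the idea (the helper names are hypothetical stand-ins, not the service's actual functions):

  ;; Hypothetical stand-ins for the real database lookup and the real
  ;; factor application - not the service's actual functions.
  (declare load-seasonality-factors apply-factors)

  (defn seasonally-adjust
    "Apply seasonality factors to a demand set. With no factors on file the
     adjustment is a no-op: the original demand comes back as the 'adjusted'
     data instead of an error."
    [demand-set]
    (let [factors (load-seasonality-factors (:series-id demand-set))]
      (if (seq factors)
        (apply-factors demand-set factors)
        demand-set)))   ; nothing to adjust with - just return the original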

But the guy that wrote it doesn't think like that. So I had to put in functions to return the empty set if there is nothing in the database with which to adjust the data. It's hokey, but hey, this entire project is like that. I'm not saying it's not valuable in some sense, but I'm looking at this and thinking that we picked a language, and a team (including me), that has no real traction here at The Shop - for the sake of what? Exposure? Coolness? Are we really moving in this direction? Or is it just another fad to pacify the Code Monkeys that want to play with new toys? Will we be moving to the next new toy when they get bored with this one?

Anyway… I've been fixing things up, and realizing that I'm getting to be a decent clojure developer. I'm not good yet - let alone really good - but it's a start, and I don't have to hit the docs every 5 mins to figure something out. It's starting to make sense, even if it isn't the way that my co-worker might like it.

Thankfully, things are really working out well. By the end of the day I had updated the code in the main pipeline app to use either form of the demand coming out of the service so that we can have a much improved demand forecasting impact in the app. Very nice to see.

I have to wait a day to put it in UAT, just to make sure things have settled out on other fronts, so that we can isolate which changes are due to this effect. But it's progress, and that's good to see.

Using sendmail on OS X 10.8

Monday, January 21st, 2013


This afternoon I was trying to deploy our jruby project to UAT, and I got the following error:

  sendmail: fatal: chdir /Library/Server/Mail/Data/spool:
      No such file or directory

and I was immediately saddened by the development. What's happening is yet another of the rubyists' shortcuts and magic gems - they wanted emailing from within ruby, and rather than make sure there's a decent, workable SMTP gem - which there has to be, or at least should be, given how easy that is to write - they went with the first thing they saw, and it uses sendmail.

Now I don't have anything against sendmail, but it's completely the wrong tool for this job. They had to put in a user's name, so all the emails seem to come from one person, as opposed to the person doing the activity. It's just a piece of junk, and for a simple reason - it's the wrong tool for the job!

But I have to make my laptop work with this. It's not running by default, and the reason is that it's the wrong tool for the job, but that's something I'll take up with them another day. Thankfully, we have a solution:

  sudo mkdir -p /Library/Server/Mail/Data/spool
  # compress the postfix man pages so that 'postfix set-permissions' below
  # doesn't complain about missing .gz files
  (cd /usr/share/man/man1 && sudo gzip postalias.1 postcat.1 postconf.1 \
      postdrop.1 postfix.1 postkick.1 postlock.1 postlog.1 postmap.1 \
      postmulti.1 postqueue.1 postsuper.1 sendmail.1)
  (cd /usr/share/man/man5 && sudo gzip access.5 aliases.5 bounce.5 canonical.5 \
      cidr_table.5 generic.5 header_checks.5 ldap_table.5 master.5 \
      mysql_table.5 nisplus_table.5 pcre_table.5 pgsql_table.5 postconf.5 \
      postfix-wrapper.5 regexp_table.5 relocated.5 tcp_table.5 transport.5 \
      virtual.5)
  (cd /usr/share/man/man8 && sudo gzip anvil.8 bounce.8 cleanup.8 discard.8 \
      error.8 flush.8 local.8 master.8 oqmgr.8 pickup.8 pipe.8 proxymap.8 \
      qmgr.8 qmqpd.8 scache.8 showq.8 smtp.8 smtpd.8 spawn.8 tlsmgr.8 \
      trivial-rewrite.8 verify.8 virtual.8)
  sudo /usr/sbin/postfix set-permissions
  sudo chmod 700 /Library/Server/Mail/Data/mat
  sudo /usr/sbin/postfix start

and with these changes, the system has sendmail running, and the gem works.

I can't think of a more completely wrong solution to the problem, but these guys aren't about the "right" answers - they're about the "magic" ones. They want to just drop a gem into a Gemfile, bundle it, and then have it do all the magic. They'll give it a bunch of configuration, and rather than question the use of such a gem, they'll completely contort the project to the point that it fits the usage of the inappropriate gem.

It's bizarro programming.

I hate it. I really do.

UPDATE: Funny developments… the gem we're using is called Pony, and it can use SMTP or sendmail - you just have to configure it differently. Also, it turns out that sendmail is the preferred way to send emails at The Shop. Kinda odd to me… putting sendmail on all hosts just to be able to send simple emails, but OK… I'm a "team player", I'll back off. But what a waste of cycles - there are so many easier ways to do this same thing, even if we keep the same gem.

Calculations are Flowing!

Friday, January 18th, 2013


Well, it's been a while getting here, but we finally have the chain reaction of calculations working in the demand service all the way up to, and including, the closed deal adjustments. I wrote quite a few tests on the closed deal adjustments because I'm working in a new language, and in order to make sure it's working as I expect, I needed to test everything - first in the unit tests of the taxonomy and price checks and the decomposition of the location data, then in the second-level functions like closed deal option decomposition, and finally in the top-level functions.
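For flavor, the unit-level tests are plain clojure.test. The function below is a made-up stand-in - splitting a deal's total projected sales evenly across its options - not the real code, but the shape of the tests is the same:

  (ns demand.closed-deals-test
    (:require [clojure.test :refer :all]))

  ;; Hypothetical function under test - splits a deal's total projected
  ;; sales evenly across its options when no per-option figures exist.
  (defn split-projected-sales
    [total option-count]
    (if (pos? option-count)
      (vec (repeat option-count (/ total option-count)))
      []))

  (deftest split-projected-sales-test
    (testing "splits evenly across the options"
      (is (= [100 100 100] (split-projected-sales 300 3))))
    (testing "no options means nothing to split"
      (is (= [] (split-projected-sales 300 0)))))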

This is a ton more testing than I'd normally write, and in about six months I won't be writing these tests, either. It's just that I'm really not at all sure about this clojure code, and in order to make myself feel more comfortable about it, I needed the tests. And the REPL.

But to see the logs emit the messages I expected, and the ones I didn't, was a joy to behold. This was a long time in coming, and it's been a rough and bumpy road, but it's getting a little smoother, and with this milestone, next week should prove to be a great step forward for the project.

I'm looking forward to it.

Pulled Additional Fields from Salesforce for Demand Adjustment

Thursday, January 17th, 2013


This afternoon I realized that we really need to have a few additional fields about the closed deals for the demand adjustment. Specifically, we have no idea of the start date for the deal, and while we have the close_date, that's not much use to us if it's empty, as many are until they have an idea of when they really want to shut down the deal. Additionally, one of the sales reps pointed out that there are projected sales figures on each 'option' in a deal, and rather than look at the total projected sales and divide it up, as we have in the past, we should be looking for those individual projected sales figures and using them - if no sales have been made.

Seems reasonable, so I added those to the Salesforce APEX class and ran it to make sure it was all OK. There were no changes needed in the ruby code because we had (smartly) left it as a hash, so additional fields aren't going to mess things up… but in our code we can now take advantage of them.

It was surprisingly time-consuming because I had to drop tables, add properties, and get things in line - but that's what happens when you add fields to a schema… you have to mess with it. Still, it's better than using a document database for this stuff. Relational with a simple structure beats document every time.
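Once those fields are flowing, the selection logic on the demand side should be straightforward. A sketch, in clojure, of the fallback order - actual sales when any exist (my assumption), then the option's own projected sales, then an even share of the deal's total projection - with made-up keys rather than the real schema:

  (defn option-projection
    "Projected figure for a single deal option: use actual sales if any have
     been made, then the option's own projected sales, and only then fall
     back to an even share of the deal's total projection."
    [deal option]
    (cond
      (pos? (get option :sold 0))  (:sold option)
      (:projected-sales option)    (:projected-sales option)
      :else                        (/ (:total-projected-sales deal)
                                      (max 1 (count (:options deal))))))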

Clojure Training Class with Aaron

Thursday, January 17th, 2013


Today has been an all-day training session at The Shop with Aaron B. - the Security Lead at Groupon and, for a while, one of the maintainers of Clojure. Very interesting to see his take on things, and I have to say, it's far more refreshing than what I'm used to seeing from the clojure crew closer to me. For instance, Aaron sees a time and a place for both OO code and functional code. He also sees that while multithreaded code is hard, there are lots of people that are very good at it - but most aren't.

His take on a lot of the things in the language was nice, too, as my current tutor isn't really giving me much other than a very mathematical bent on things, and that's not mapping onto my experience as nicely as I might like. It's really pretty bad, actually. But it's getting better, and the more I work with it, the better I'm getting, and that makes things a lot easier.

I'm guessing that in about six months things will have pretty much settled out, and I'll be able to just hit up Aaron now and again for performance advice, or how things work under the hood. When that time comes, I'll be a lot happier using clojure in production systems, but for now, it's still pretty hard.

Gotta keep working at it.

Finally Finished Up Closed Deal Adjustments

Wednesday, January 16th, 2013


Today I finally got a first good cut of the closed deal adjustment feature in the demand service we've been working on. The basic concept is that anything that's been closed by a sales rep since the delivery of the demand has to be subtracted from the demand, as it can already be considered "fulfilled". This is already being done in the mainline ruby app, but the goal is to move it out of the ruby app and into the demand service, so that all the pulling and adjusting can be done there and the mainline ruby app doesn't have to waste the time doing it.

It's a good idea, and it'll save us between 20% and 30% of the runtime, and as we scale to the global markets, that's going to be very nice to have. But it's not easy taking mutable, reference-laden ruby objects and making it all immutable and functional. Still, today I think I have it all done. Well… at least ready for testing. That's tomorrow...
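The core of it ends up being a pure function: total up whatever has closed since the demand was delivered and subtract it from the forecast, returning a new demand rather than mutating the old one. A tiny sketch of that shape, with made-up keys (:units, :closed-units) rather than the real schema:

  (defn adjust-for-closed-deals
    "Subtract the units already fulfilled by closed deals from the forecast
     demand. Pure function: returns a new demand map, never mutates the input."
    [demand closed-deals]
    (let [closed-units (reduce + 0 (map :closed-units closed-deals))]
      (update-in demand [:units] #(max 0 (- % closed-units)))))

  ;; e.g. (adjust-for-closed-deals {:units 500}
  ;;                               [{:closed-units 120} {:closed-units 60}])
  ;;      => {:units 320}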