This morning I finally finished something I've been working on, on and off, for several days now - adding Closed Deals from Salesforce into the Demand Service that we're building in Clojure. At some point I'll probably drop the mention of what it's written in, but it's still too soon, as I think it was picked for all the wrong reasons. But that's neither here nor there this morning; it is the tool for now, and the future will bring what it brings.
The reason for needing the Closed Deals from Salesforce is that we are still totally dependent on Salesforce for holding all the actual bookable deals and merchant data. If we want to adjust the demand forecast by what's in inventory, then we need to get the data from Salesforce on at least a nightly basis, and use that data to update the demand forecast and "back it off" by the deals that have been closed since the demand was generated.
So if there's a demand forecast point of 1000 units, generated three days ago, and the sales reps closed a deal for 500 units yesterday, then today we really only need to show demand of 500 units. The problem with all this is that Salesforce is not exactly known for useful data and effective schemas. It's all there, but it's by no means easy to get to, or easy to use.
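The arithmetic in that example is simple enough to sketch in a few lines. The function and key names here are hypothetical, not the actual Demand Service code:

```clojure
;; A minimal sketch of the "back off" arithmetic described above.
;; The :units key and function name are hypothetical placeholders.
(defn back-off-demand
  "Reduce a forecast point by the units closed since it was generated,
   never letting remaining demand go below zero."
  [forecast-units closed-deals]
  (let [closed-units (reduce + 0 (map :units closed-deals))]
    (max 0 (- forecast-units closed-units))))

;; e.g. a 1000-unit forecast with a 500-unit deal closed yesterday:
(back-off-demand 1000 [{:units 500}])
;; => 500
```

The `max` guard matters: if the reps somehow close more than was forecast, we want to show zero remaining demand, not a negative number.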
The first thing to do was to spend a day or two just getting the data out of Salesforce. Not as easy as I'd hoped, as everything is a REST interface - what, have these people never heard of sockets? Anyway… I had a lot of grief with the paging that Salesforce requires, since it can't (or won't) send you all the data at once. And it's not a size-limit thing, though they may advertise that as the reason - I've gotten "pages" with three small elements in them - so it's more than that, and for whatever reason, it's there and I have to deal with it.
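For anyone who hasn't fought this before: the Salesforce query REST endpoint returns a page of records plus a done flag and, when there's more, a nextRecordsUrl to follow. A sketch of walking that, assuming clj-http and cheshire are available; the base-url and token handling are placeholders for the real client setup:

```clojure
;; Sketch of Salesforce query pagination. Assumes clj-http + cheshire;
;; base-url, path, and token plumbing are hypothetical.
(require '[clj-http.client :as http]
         '[cheshire.core :as json])

(defn fetch-page [base-url path token]
  (-> (http/get (str base-url path)
                {:headers {"Authorization" (str "Bearer " token)}})
      :body
      (json/parse-string true)))

(defn all-records
  "Follow :nextRecordsUrl until Salesforce says :done, accumulating
   every page's :records. Page sizes vary wildly, so trust the :done
   flag rather than counting rows."
  [base-url first-path token]
  (loop [path first-path, acc []]
    (let [{:keys [records done nextRecordsUrl]} (fetch-page base-url path token)
          acc (into acc records)]
      (if done
        acc
        (recur nextRecordsUrl acc)))))
```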
I thought I had it all figured out, but I was slightly mistaken about how Clojure's take-while works. It continues as long as the value returned by the predicate is "truthy" in some sense of the word - meaning it stops automatically on hitting a nil, so the predicate function I'd written to test for nil was unnecessary. Simple mistake, and my version worked, but it wasn't the "clojure way", and when in Rome…
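The lesson fits in a couple of lines: since take-while keeps elements only while the predicate returns something truthy, plain identity is all you need to cut a sequence off at the first nil.

```clojure
;; take-while keeps elements while the predicate returns anything
;; truthy, so `identity` stops at the first nil - no hand-rolled
;; nil-checking predicate required.
(take-while identity [1 2 3 nil 4 5])
;; => (1 2 3)

;; The same idea applied to paging: keep taking pages until the
;; fetch function (hypothetical here) returns nil.
(defn pages [fetch-fn]
  (take-while identity (map fetch-fn (range))))
```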
After I was able to get the data, I spent a couple of days just figuring out the PostgreSQL database schema so that we can load the data easily and get it back out just as easily. We also need to make sure that we create the Clojure entities for these tables, and that they are related to one another properly. It's a usable, if manual, ORM for Clojure, and when in Rome…
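The shape of the schema might look something like the sketch below - the table and column names are hypothetical, and it assumes clojure.java.jdbc for running the DDL. The key point is the unique constraint on the Salesforce ID, which is what makes the duplicate-checking later possible:

```clojure
;; Hypothetical schema sketch; real table/column names will differ.
(require '[clojure.java.jdbc :as jdbc])

(def ddl
  ["CREATE TABLE IF NOT EXISTS merchants (
      id      SERIAL PRIMARY KEY,
      sfdc_id TEXT UNIQUE NOT NULL,
      name    TEXT)"
   "CREATE TABLE IF NOT EXISTS closed_deals (
      id          SERIAL PRIMARY KEY,
      sfdc_id     TEXT UNIQUE NOT NULL,
      merchant_id INTEGER REFERENCES merchants (id),
      units       INTEGER NOT NULL,
      closed_on   DATE NOT NULL)"])

(defn create-tables!
  "Run each DDL statement against the given db-spec."
  [db]
  (doseq [stmt ddl]
    (jdbc/execute! db [stmt])))
```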
With the schema working, I then had to load the data into the tables. This started out OK, but as soon as I tried to read it back out, I ran into problems. The way the code is structured, we read out the potentially matching data, compare it to the incoming record, and based on the results of that comparison, we either stop what we're doing (it's already there) or we insert the new data.
My code was failing on pulling the data out, as the comparisons weren't working as planned. What I saw was a nice opportunity to change the logic a bit, so I did. I created a function that simply looked in the database to see if the deal I had in hand was already there. If so, it returned the ID of that deal. If not, it returned nil. This was really nice in that I don't have to read everything out and then compare it - I just want to know if it's already there!
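That existence-check-then-insert pattern might be sketched like this, again assuming clojure.java.jdbc and the hypothetical closed_deals table from earlier; the real code surely differs in the details:

```clojure
;; Sketch of the "is it already there?" check. Table and key names
;; are hypothetical; assumes clojure.java.jdbc.
(require '[clojure.java.jdbc :as jdbc])

(defn deal-id
  "Return the database id of the deal with this Salesforce id,
   or nil if it has never been loaded."
  [db sfdc-id]
  (-> (jdbc/query db ["SELECT id FROM closed_deals WHERE sfdc_id = ?" sfdc-id])
      first
      :id))

(defn load-deal!
  "Insert the deal only when deal-id comes back nil, so replaying an
   overlapping window of history never creates duplicates."
  [db {:keys [sfdc-id units closed-on]}]
  (when-not (deal-id db sfdc-id)
    (jdbc/insert! db :closed_deals
                  {:sfdc_id   sfdc-id
                   :units     units
                   :closed_on closed-on})))
```

Because deal-id returns nil for an unseen deal and a truthy id for a known one, the when-not reads naturally and the whole load is idempotent.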
This made things a lot nicer, and things really started working. Very nice. No duplicates are loaded, but we can run this script over a trailing two-week historical window every day and be assured that we're missing nothing. Very sweet.
Took a while, but I learned a lot, and it's working well now.