I have spent far more time on this problem today than it deserves. One of the real annoyances with Clojure is also one of its touted strengths - the JVM. But that's where I'm having my problems today - in the ugly world of java.util.Date and its ilk. When I receive a JSON snippet like the following:
{
"division": "cleveland",
"start_date": "2012-12-07",
"end_date": "2013-01-07",
"demand" : [ … ]
}
and need to place it into a simple PostgreSQL table:
(:require [clojure.java.jdbc :as sql])
(sql/create-table "demand_sets"
  [:id :uuid "PRIMARY KEY DEFAULT uuid_generate_v4()"]
  [:division_id :uuid "references divisions (id)"]
  [:source_id :uuid "references sources (id)"]
  [:valid_from "timestamp with time zone"]
  [:valid_to "timestamp with time zone"]
  [:loaded_at "timestamp with time zone" "DEFAULT now()"])
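One note on running that DDL: in the 0.2-era clojure.java.jdbc API this code uses, create-table has to run inside with-connection. A minimal sketch - the db spec here is a hypothetical example, substitute your own:

```clojure
(require '[clojure.java.jdbc :as sql])

;; Hypothetical connection spec - swap in your own database details.
(def db {:subprotocol "postgresql"
         :subname     "//localhost:5432/demand"
         :user        "app"
         :password    "secret"})

;; create-table picks up the connection bound by with-connection.
(sql/with-connection db
  (sql/create-table "demand_sets"
    [:id :uuid "PRIMARY KEY DEFAULT uuid_generate_v4()"]
    ;; …remaining columns as above
    ))
```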
One would expect that we'd convert those strings into valid timestamps with something like clj-time:
(defn get-demand-set
  [division]
  ;; assumes (:require [clj-time.coerce :refer [to-date-time]])
  (let [demand-set (client/extract division)
        start-date (:start_date demand-set)
        end-date   (:end_date demand-set)
        demands    (:demand demand-set)]
    {:valid_from (to-date-time start-date)
     :valid_to   (to-date-time end-date)
     :demand     (map transform demands)}))
and, in fact, we get nice Joda-Time DateTime objects (which is what clj-time's to-date-time returns) at the correct instant in the UTC (Zulu) time zone. This all makes sense.
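A quick REPL sketch of that coercion (assuming clj-time is on the classpath - client/extract and transform above are the app's own functions and omitted here):

```clojure
(require '[clj-time.coerce :refer [to-date-time]])

;; An ISO-8601 date string with no time component parses to
;; midnight UTC of that day - a Joda-Time DateTime, not a java.util.Date.
(to-date-time "2012-12-07")
;; => #<DateTime 2012-12-07T00:00:00.000Z>
```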
Where it gets squirrely is the saving and reloading from a PostgreSQL database. The TIMESTAMP WITH TIME ZONE columns created above don't actually store a zone - PostgreSQL normalizes the value to UTC and renders it in whatever time zone the server is configured for, right there in the config file. I had my local database set to match my local box - US/Central - and that was part of the problem. I wanted to see the dates in my database as I was seeing them in the JSON, and that's not realistic. I needed to understand that I'd really be looking at "2012-12-06 18:00:00" for the start-date, as that's the US/Central rendering of 2012-12-07 00:00:00 UTC.
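That offset is easy to see at the REPL. A sketch using clj-time's formatter (again assuming clj-time is on the classpath):

```clojure
(require '[clj-time.core   :as t]
         '[clj-time.format :as f])

;; The instant from the JSON: midnight UTC on 2012-12-07.
(def start (t/date-time 2012 12 7))

;; Render that same instant in US/Central - this is what a
;; US/Central-configured PostgreSQL server would display.
(f/unparse (f/with-zone (f/formatter "yyyy-MM-dd HH:mm:ss")
                        (t/time-zone-for-id "US/Central"))
           start)
;; => "2012-12-06 18:00:00"
```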
Then I had to hassle with getting them out and comparing them correctly. This is really where this whole problem started - comparing dates where they formatted correctly, but were just off by a timezone or offset of some kind.
To do that, it turns out I needed to have something that would get me a single demand_set record:
(defn find-demand-set
  [demand-set-id]
  (first (select demand-sets
                 (where {:id demand-set-id}))))
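The select above assumes a Korma entity for the table. A sketch of how it might be defined - the connection details here are hypothetical placeholders:

```clojure
(require '[korma.db   :refer [defdb postgres]]
         '[korma.core :refer [defentity table select where]])

;; Hypothetical connection info - substitute your own.
(defdb prod (postgres {:db "demand" :user "app" :password "secret"}))

;; Maps the demand_sets table created earlier.
(defentity demand-sets
  (table :demand_sets))
```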
and here we're using korma for the entity mapping and selecting. With this we can get a demand set back and pull out the timestamp fields, but I need to make sure that I convert them properly:
(defn fetch-demand-set
  [demand-set-id]
  (let [demand-set-rec (db/find-demand-set demand-set-id)
        ;; …more here
        ]
    {:valid_from (from-date (:valid_from demand-set-rec))
     :valid_to   (from-date (:valid_to demand-set-rec))
     ;; …more here
     }))
And then we can safely do something like this:
(defn demand-sets-equal?
  [ds1 ds2]
  (and (= (:valid_from ds1) (:valid_from ds2))
       (= (:valid_to ds1) (:valid_to ds2))
       ;; …more stuff here
       ))
and the timestamps will properly compare.
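One caveat worth noting: Joda-Time's equals (and therefore Clojure's =) compares the time zone as well as the instant, so the comparison above only works because both sides come back through from-date as UTC DateTimes. A sketch of the trap:

```clojure
(require '[clj-time.core :as t])

(def utc     (t/date-time 2012 12 7))
(def central (t/to-time-zone utc (t/time-zone-for-id "US/Central")))

;; Same instant, different zones: = is false, but .isEqual
;; (Joda's instant-only comparison) is true.
(= utc central)        ;; => false
(.isEqual utc central) ;; => true
```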
But don't expect them to look like they did in the JSON unless your PostgreSQL databases are set to UTC. Thankfully, that's how things are configured in the datacenter, so I don't have to worry about it there. The only place I have to worry is on my laptop, and that's something I can live with for now.
There is one alternative I could have chosen: a TIMESTAMP WITHOUT TIME ZONE column in the PostgreSQL table. I think that would have worked as well, and it would have shown the same value in all databases, regardless of each server's time zone setting. But then we would lose the absolute point in time, and I wasn't ready to do that yet.
Still, to have spent several hours on this was exhausting.