Archive for the ‘Clojure Coding’ Category

Parsing JSON with wrap-json-body

Wednesday, March 19th, 2025


I have been making good progress on the Clojure app, and then got to handling a PUT call, and realized that the current scheme I was using for parsing the :body of the compojure request really was not what I wanted. After all, the Content-Type is set to application/json in the call, so there should be a way for the ring middleware to detect this and parse the JSON body, so that the :body of the request is a Clojure map, and not something that has to be parsed.

So I went digging...

As it turns out, there is a set of ring middleware for JSON parsing, and the middleware I needed was already written: wrap-json-body. Using it really is quite simple:

  (:require [camel-snake-kebab.core :refer [->kebab-case-keyword]]
            [ring.middleware.json :refer [wrap-json-body]])
 
  (def app
    "The actual ring handler that is run -- this is the routes above
     wrapped in various middlewares."
    (let [backend (session-backend {:unauthorized-handler unauthorized-handler})]
      (-> app-routes
          (wrap-access-rules {:rules rules :on-error unauthorized-handler})
          (wrap-authorization backend)
          (wrap-authentication backend)
          wrap-user
          (wrap-json-body {:key-fn ->kebab-case-keyword})
          wrap-json-with-padding
          wrap-cors
          (wrap-defaults site-defaults*)
          wrap-logging
          wrap-gzip)))

where the key middleware is:

          (wrap-json-body {:key-fn ->kebab-case-keyword})

and it takes the ->kebab-case-keyword function and applies it to each key of the parsed JSON - for me, that makes them kebab-cased keywords. This means I only have to get the spelling right in the client code, and I don't care a whit about the casing - very nice. 🙂
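For example, here is what that key-fn does to a few incoming JSON key spellings at the REPL (using camel-snake-kebab directly):

```clojure
;; the :key-fn normalizes every incoming JSON key, whatever its casing
(require '[camel-snake-kebab.core :refer [->kebab-case-keyword]])

(->kebab-case-keyword "userName")   ;; => :user-name
(->kebab-case-keyword "USER_NAME")  ;; => :user-name
(->kebab-case-keyword "user-name")  ;; => :user-name
```

So a client sending {"userName": "Bob"} or {"USER_NAME": "Bob"} lands in the handler as {:user-name "Bob"} either way.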

With this, an element of a defroutes can look like:

    (POST "/login" [:as {session :session body :body}]
      (do-login session body))

and the body will be parsed JSON with kebab-cased keywords for keys. You can't get much nicer than that. Clean, simple code. Beautiful.

Writing buddy-auth Authorization Handlers

Friday, March 14th, 2025


As I'm building out the core functionality of my current application, the next thing I really wanted to add was a few Authorization handlers for the buddy-auth system I started using with WebAuthN Authentication. The WebAuthN work started with a simple Is this user logged in? handler:

  (defn is-authenticated?
    "Function to check the provided request data for a ':user' key, and if it's
    there, then we can assume that the user is valid, and authenticated with the
    passkey. This :user is on the request because the wrap-user middleware put
    it there based on the :session data containing the :identity element and we
    looked up the user from that."
    [{user :user :as req}]
    (uuid? (:id user)))

In order for this to work properly, we needed to make a wrap-user middleware so that if we had a logged in user in the session data, then we would place it in the request for compojure to pass along to all the other middleware, and the routes themselves. This wasn't too hard:

  (defn wrap-user
    "This middleware is for looking at the :identity in the session data, and
    picking up the complete user from their :email and placing it on the request
    as :user so that it can be used by all the other endpoints in the system."
    [handler]
    (fn [{session :session :as req}]
      (handler (assoc req :user (get-user (:email (:identity session)))))))

and this middleware used a function, get-user, to load the complete user object from the database based on the email of the user. It's not all that hard, but there are some tricks about the persistence of the Passkey Authenticator object that has to be serialized, and I've already written about that a bit.

And this works perfectly because the WebAuthN workflow deposits the :identity data in the session, and since it's stored server-side, it's safe, and with ring session state persisted in redis, we have this survive restarts, and shared amongst instances in a load balancer. But what about something a little more specific? Like, say we have an endpoint that returns the details of an order, but only if the user has been given permission to see the order?

This combines Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) - and while some will say you only need one, that's really not the best way to build a system, because there are times when you need some of both to make the solution as simple as possible.

In any case, this is what we need to add:

  • For a user, can they actually see the order in the database? This can be a question of the schema and data model, but there will likely be a way to determine if the user was associated with the order, and if so, then they can see it, and if not, then they can't.
  • Most endpoint conventions have the identifier as the last part of the URL - the Path, as it is referred to. We will need to be able to easily extract the Path from the URL, or URI, in the request, and then use that as the identifier of the order.
  • Put these into an authorization handler for buddy-auth.

For the first, I made a simple function to see if the user can see the order:

  (defn get-user-order
    "Function to take a user id and order id and return the user/order
    info, if any exists, for this user and this order. We then need to
    look up the user-order, and return the appropriate map - if one exists."
    [uid oid]
    (if (and (uuid? uid) (uuid? oid))
      (db/query ["select * from users_orders
                   where user_id = ? and order_id = ?" uid oid]
        :row-fn kebab-keys-deep :result-set-fn first)))

For the second, we can simply look at the :uri in the request, and split it up on the /, and then take the last one:

  (defn uri-path
    "When dealing with Buddy Authentication handlers, it's often very useful
    to be able to get the 'path' from the request's uri and return it. The
    'path' is defined to be:
       https://google.com/route/to/path
    and is the last part of the url *before* the query params. This is very
    often a uuid of an object that we need to get, as it's the one being
    requested by the caller."
    [{uri :uri :as req}]
    (if (not-empty uri)
      (last (split uri #"/"))))

For the last part, we put these together, and we have a buddy-auth authorization handler:

  (defn can-see-order?
    "Function to take a request, and pull out the :user from the wrapping
    middleware, and pick the last part of the :uri as that will be the
    :order-id from the URL. We then need to look up the user-order, and
    see if this user can see this order."
    [{user :user :as req}]
    (if-let [hit (get-user-order (:id user) (->uuid (uri-path req)))]
      (not-nil? (some #{"OPERATOR"} (:roles hit)))))
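The ->uuid helper used above isn't shown in this post; a minimal sketch of what it's assumed to do - coerce a string to a java.util.UUID, pass UUIDs through, and return nil for anything else - would be:

```clojure
(defn ->uuid
  "Sketch of the coercion assumed above: pass UUIDs through, parse
  strings, and return nil for anything that doesn't parse."
  [s]
  (cond
    (uuid? s)   s
    (string? s) (try
                  (java.util.UUID/fromString s)
                  (catch IllegalArgumentException _ nil))))
```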

In this function we see that we are referring to :roles on the user-order, and that's because we have built up the cross-reference table in the database to look like:

  CREATE TABLE IF NOT EXISTS users_orders (
    id              uuid NOT NULL,
    version         INTEGER NOT NULL,
    as_of           TIMESTAMP WITH TIME ZONE NOT NULL,
    by_user         uuid,
    user_id         uuid NOT NULL,
    roles           jsonb NOT NULL DEFAULT '[]'::jsonb,
    title           VARCHAR,
    description     VARCHAR,
    order_id        uuid NOT NULL,
    created_at      TIMESTAMP WITH TIME ZONE NOT NULL,
    PRIMARY KEY (id, version, as_of)
  );

The key parts are the user_id and order_id - the mapping is many-to-many, so we have to have a cross-reference table to handle that association. Along with these, we have some metadata about the reference: the title of the User with regard to this order, the description of the relationship, and even the roles the User will have with regards to this order.

The convention we have set up is that if the roles contain the string OPERATOR, then the user can see the order. The Postgres JSONB field is ideal for this, as it allows for a simple array of strings, and it fits right into the data model.
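At the REPL, that membership check in can-see-order? is just the usual Clojure set-as-predicate trick - some returns the matching element, or nil:

```clojure
;; the set #{"OPERATOR"} acts as a predicate over the roles vector
(some #{"OPERATOR"} ["VIEWER" "OPERATOR"])  ;; => "OPERATOR"
(some #{"OPERATOR"} ["VIEWER"])             ;; => nil
```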

With all this, we can then make a buddy-auth access rule that looks like:

   {:pattern #"^/orders/[-0-9a-fA-F]{36}"
    :request-method :get
    :handler {:and [is-authenticated? can-see-order?]}}

and the endpoints that match that pattern will have to pass both handlers - and we have exactly what we wanted, without having to place any code in the actual routes or functions to handle the authorization. Nice. 🙂

Working with java.time in Clojure

Tuesday, March 11th, 2025


I've been working on a new Clojure project, and since I last did production Clojure work, the Joda Time library has been deprecated, and the move has been to the Java 8 java.time classes. The functionality is basically the same, but the conversion isn't, and one of the issues is that the JDBC Postgres library will return Date and Timestamp objects - all based on java.util.Date.

As it turns out, the conversion isn't as easy as I might have hoped. 🙂

For the most part, it's just a simple matter of using different functions, and the capabilities are all there in the Clojure clojure.java-time library. The one tricky part is the conversion with Postgres. There, we have a protocol set up to help with conversions:

  (:require [cheshire.core :as json]
            [clojure.java.jdbc :refer [IResultSetReadColumn]]
            [java-time.api :as jt])
  (:import org.postgresql.util.PGobject)
 
  (extend-protocol IResultSetReadColumn
    PGobject
    (result-set-read-column [pgobj metadata idx]
      (let [type  (.getType pgobj)
            value (.getValue pgobj)]
        (case type
          "json" (json/parse-string-strict value true)
          "jsonb" (json/parse-string-strict value true)
          value)))
 
    java.sql.Timestamp
    (result-set-read-column [ts _ _]
      (jt/zoned-date-time (.toLocalDateTime ts) (jt/zone-id)))
 
    java.sql.Date
    (result-set-read-column [ts _ _]
      (.toLocalDate ts)))

and the key features are the last two. These are the conversions of the SQL Timestamp into java.time.ZonedDateTime and Date into java.time.LocalDate values.

As it turns out, the SQL values have local date and date/time accessors, so converting to a zoned timestamp just means pairing the LocalDateTime with a convenient zone. The driver renders the wall-clock time in the JVM's default zone, so using the system default is as good as any, and keeps things nicely consistent.
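As a quick sketch of that conversion (the exact output is elided here, since it depends on the JVM's default zone):

```clojure
(require '[java-time.api :as jt])

;; a wall-clock Timestamp, much as the Postgres driver hands it back
(def ts (java.sql.Timestamp/valueOf "2025-03-11 10:15:00"))

;; pair it with the JVM default zone -- the same zone the driver used
(jt/zoned-date-time (.toLocalDateTime ts) (jt/zone-id))
```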

With these additions, the data coming from Postgres 16 timestamp and date columns is properly massaged into something that can be used in Clojure with the rest of the clojure.java-time library. Very nice!

UPDATE: Oh, I missed a few things, so let's get it all cleared up here now. The protocol extensions, above, are great for reading out of the Postgres database. But what about inserting values into the Postgres database? This needs a slightly different protocol to be extended:

  (defn value-to-jsonb-pgobject
    "Function to take a _complex_ clojure data element and convert it into
    JSONB for inserting into postgresql 9.4+. This is the core of the mapping
    **into** the postgres database."
    [value]
    (doto (PGobject.)
          (.setType "jsonb")
          (.setValue (json/generate-string value))))
 
  (extend-protocol ISQLValue
    clojure.lang.IPersistentMap
    (sql-value [value] (value-to-jsonb-pgobject value))
 
    clojure.lang.IPersistentVector
    (sql-value [value] (value-to-jsonb-pgobject value))
 
    clojure.lang.IPersistentList
    (sql-value [value] (value-to-jsonb-pgobject value))
 
    flatland.ordered.map.OrderedMap
    (sql-value [value] (value-to-jsonb-pgobject value))
 
    clojure.lang.LazySeq
    (sql-value [value] (value-to-jsonb-pgobject value))
 
    java.time.ZonedDateTime
    (sql-value [value] (jt/format :iso-offset-date-time value))
 
    java.time.LocalDate
    (sql-value [value] (jt/format :iso-local-date value)))

Basically, we need to tell the Clojure JDBC code how to map the objects, Java or Clojure, into the SQL values that the JDBC driver is expecting. In the case of the date and timestamp, that's not too bad, as Postgres will cast from strings to the proper values for the right formats.
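With those extensions in place, plain Clojure data can be handed straight to an insert - something like this sketch, where db-spec and the column values are purely illustrative:

```clojure
(require '[clojure.java.jdbc :as db]
         '[java-time.api :as jt])

;; :roles lands in the jsonb column via value-to-jsonb-pgobject, and the
;; ZonedDateTime is formatted as an ISO-8601 string for Postgres to cast
(db/insert! db-spec :users_orders
  {:user_id  (java.util.UUID/randomUUID)
   :order_id (java.util.UUID/randomUUID)
   :roles    ["OPERATOR"]
   :as_of    (jt/zoned-date-time)})
```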

But there remains a third set of key values - the parameters to PreparedStatement objects. This is key as well, and they need to be SQL objects, but here the casting isn't done by Postgres - it's done in the JDBC driver, and that needs proper Java SQL objects. For this, we need to add:

  (extend-protocol ISQLParameter
    java.time.ZonedDateTime
    (set-parameter [value ^PreparedStatement stmt idx]
      (.setTimestamp stmt idx (jt/instant->sql-timestamp (jt/instant value))))
 
    java.time.LocalDate
    (set-parameter [value ^PreparedStatement stmt idx]
      (.setDate stmt idx (jt/sql-date value))))

Here, the Clojure java-time library handles the date easily enough, and I just need to take the ZonedDateTime into a java.time.Instant, and then the library again takes it from there.

These last two bits are very important for full-featured use of the new Java Time objects with Postgres SQL, and very much worth the effort.

Persisting Java Objs within Clojure

Thursday, March 6th, 2025


For the last day or so I've been wrestling with a problem using WebAuthN on a Clojure web app based on compojure and ring. There were a few helpful posts that got me headed in the right direction, and using Buddy helps, as it handles a lot of the route handling, but getting the actual WebAuthN handshake going was a bit of a pain.

The problem was that after the Registration step, you end up with a Java Object, an instance of com.webauthn4j.authenticator.AuthenticatorImpl and there is no simple way to serialize it out for storage in a database, so it was time to get creative.

I did a lot of digging, and I was able to find a nice way to serialize the object out to JSON, but there was no way to reconstitute that JSON into an AuthenticatorImpl, so that had to be scrapped.

Then I found a reference to an Apache Commons lang object that supposedly was exactly what I wanted... it would serialize to a byte[], and then deserialize from that byte[] into the object. Sounds good... but I needed to save it in a Postgres database. Fair enough... let's Base64 encode it into a string, and then decode it on the way out.

The two key functions are very simple:

  (:import org.apache.commons.lang3.SerializationUtils
           java.util.Base64)
 
  (def not-nil? (complement nil?))
 
  (defn obj->b64s
    "This is a very useful function for odd Java Objects as it is an Apache tool
    to serialize the Object into a byte[], and then convert that into a Base64
    string. This is going to be very helpful with the persistence of objects to
    the database, as for some of the WebAuthN objects, it's important to save
    them, as opposed to re-creating them each time."
    [o]
    (if (not-nil? o)
      (.encodeToString (Base64/getEncoder) (SerializationUtils/serialize o))))
 
  (defn b64s->obj
    "This is a very useful function for odd Java Objects as it is an Apache tool
    to deserialize a byte[] into the original Object that was serialized with
    obj->b64s. This is going to be very helpful with the persistence of objects
    to the database, as for some of the WebAuthN objects, it's important to save
    them, as opposed to re-creating them each time."
    [s]
    (if (not-nil? s)
      (SerializationUtils/deserialize (.decode (Base64/getDecoder) s))))

These then fit into the saving and querying very simply, and it all works out just dandy. 🙂 I will admit, I was getting worried because I was about to regenerate the AuthenticatorImpl on each call, and that would have been a waste for sure.
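A quick round-trip shows the idea - using a plain vector here instead of a real AuthenticatorImpl, since the pair of functions works for any java.io.Serializable value:

```clojure
(def s (obj->b64s ["bob" "alice"]))  ;; a Base64 string, ready for the db
(b64s->obj s)                        ;; => ["bob" "alice"]
```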

The complete WebAuthN workflow is the point of another post, and a much longer one at that. But this really made all the difference.

Upgraded to Java 17.0.14 and 11.0.26

Thursday, February 20th, 2025


I have been looking at starting some projects in Clojure for work, and I thought it would be good for me to get the latest JDK 17 from Homebrew and Temurin. As it turns out, the latest for JDK 17 is now JDK 17.0.14, and since I had a slightly older release, and the Homebrew name changed, I had to:

  $ brew tap homebrew/cask

and then to actually update it:

  $ brew install --cask temurin@17

When I checked:

  $ java -version
  openjdk version "17.0.14" 2025-01-21
  OpenJDK Runtime Environment Temurin-17.0.14+7 (build 17.0.14+7)
  OpenJDK 64-Bit Server VM Temurin-17.0.14+7 (build 17.0.14+7, mixed mode, sharing)

which is exactly what I was hoping for.

As an interesting note, the re-tapping of the cask updated the name of temurin11 to temurin@11, and I updated JDK 11 as well - why not? It's at 11.0.26, and I might use it... you never know.

Now I'm up-to-date with both versions, and I can easily switch from one to the other with the shell function I wrote. Excellent! 🙂

Advent of Code 2022 is On!

Thursday, December 1st, 2022


This morning I did the first day of the 2022 Advent of Code. What fun it is to get back into Clojure - for a month. If I thought it the least bit reasonable, I'd be doing back-end ClojureScript, as it compiles to JavaScript in the same way that TypeScript does, so it would run on the same stack with the same speed, etc. But it's just too big a leap for most folks, and it's not worth the education cycles.

But still... the simplicity of the language, and its ability to run in highly multi-threaded environments, is a huge win, and so it will remain one of my very favorite languages.

Temurin 11 is on Homebrew

Tuesday, May 24th, 2022


This morning I decided to see if I could get the AdoptOpenJDK 11, now called Temurin 11, going on my M1 Max MacBook Pro. In the past, they have had the AdoptOpenJDK 11 on Homebrew, but it was Intel, and I didn't want to bring in Rosetta 2 for any Clojure work, so I was willing to wait. I read on Twitter from the Eclipse Foundation that they had placed Java 11 and 8 for Apple Silicon on Homebrew, so why not?

It turns out, it's not bad at all:

  $ brew tap homebrew/cask-versions
  $ brew install --cask temurin11

and it's ready to go. Then I can use my setjdk function to switch between JDK 11 and 17 on my laptop, which has been handy for some issues I've had to deal with in the past. Don't know when I'll need this again.

Sadly, the architecture for JDK 8 is still Intel:

  $ brew info temurin8
  temurin8: 8,332,09
  https://adoptium.net/
  Not installed
  From: https://github.com/Homebrew/homebrew-cask-versions/blob/HEAD/Casks/temurin8.rb
  ==> Name
  Eclipse Temurin 8
  ==> Description
  JDK from the Eclipse Foundation (Adoptium)
  ==> Artifacts
  OpenJDK8U-jdk_x64_mac_hotspot_8u332b09.pkg (Pkg)

that last line is the kicker: x64... so it goes. Still... I have JDK 11 for Apple Silicon now, that's good. 🙂

Merry Christmas!

Saturday, December 25th, 2021


It's another Christmas, and things have started out quietly... I was able to finish Advent of Code today - all 25 Days in 25 days. It's an unusually warm Christmas - so no snow, which isn't really unusual, but the temps being as warm as they are is a little odd. Still... the temps will fall, and the snow will come, and it'll be time to shovel snow, but for now it's just like an extended Fall.

The Christmas Music has been great this year, and I've enjoyed some really good Christmas Movies. I wrote a Christmas Letter yesterday and sent it to extended family, and it seemed to be appreciated. I like catching folks up with a little humor. 🙂

Life goes on.

The iPad Pro Really is Something

Wednesday, December 22nd, 2021


I have been using the iPad Pro for two generations now, and my current model is the M1 iPad Pro, and during the pandemic, it has really proven to be a great Zoom, Meet, etc. box. It has a nice camera, and with Front and Center in iPadOS, it really makes it easy to have a good presentation, or meeting. But it's really so much more.

With the GitHub app, I can get PR notifications, review them, and merge them. With the GitHub Workflow Actions, we have continuous deployment, and that is really quite amazing to me. Of course there are the shells to boxes, and that is great, but even offline, there is so much to like about this machine.

It's rugged - compared to my MacBook Pro, and the screen is a lot easier to clean. I'm not saying my MacBook Pro isn't nice... it's just that in some respects, the iPad Pro is nicer.

What a really amazing device. 🙂

The Nasty Log4j Business

Monday, December 20th, 2021


It's been a wild couple of weeks for the log4j team... I mean, the problem with a logger is that you don't really want to limit it, and adding the url handlers probably seemed like a great idea at the time, but once they started to be used, it was understandably hard to drop support for them. And then the exploit hit.

It's just one of those nearly universal components of JVM systems that is being supported by volunteers, trying to thread the needle between keeping as much of the functionality as they can... while restricting the vulnerability to something that can't be exploited. It's clearly not easy, as they've had at least three releases of the 2.x codebase to try and correct the vulnerability, and each time, there seems to be more to do.

This is certainly going to shift how some open source teams function... it's great to be the author, or maintainer of something as used as log4j, but to have this kind of attention... well... I'm sure it's not what they were hoping for this Christmas. 🙂