Merry Christmas!

December 25th, 2023

Christmas Tree

It's been a nice, quiet Christmas here, and that's about all I could possibly ask for. I've treated it much like a normal Monday, with a morning run, then the crossword, and checking on what's happening at work.

No major issues for work today, but a few little things that needed to be done. Not bad at all... 🙂

Postgres JSONB Field Manipulations

December 4th, 2023

PostgreSQL.jpg

Since it was introduced, the JSONB field in Postgres has been one of the most useful types I've encountered. It gives you complex data as a single field, with very powerful querying tools into the data - as needed. But one thing that has always been a little tough for me is manipulating that data, in SQL, so that it stays JSON, but is altered from its as-stored value.

Let's say we have a table that has a JSONB field that has an array of Objects, each looking something like this:

  {
    "id": 101,
    "size": "large",
    "score": 91,
    "count": 1232
  }

and there can be more of these elements, but what you really want in your SELECT is just an Array of the id and score values, and nothing else. It still needs to be an Array, and each element still needs to be an Object - just a smaller Object, with only those two attributes.
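So, for the element above plus a second one (made up here just for illustration), the goal is a result shaped like this:

  [
    {"id": 101, "score": 91},
    {"id": 102, "score": 87}
  ]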

From the docs, Postgres has jsonb_array_elements(), which can turn an Array into something that looks like a table... and we can use the ->> notation to get at the value of an attribute. But how does it all fit together to disassemble, modify, and re-assemble all the elements? A sub-select.

Let's walk through one.

  SELECT jsonb_agg(
           jsonb_build_object(
             'id', t.value->>'id',
             'score', t.value->>'score'
           )
         )
    FROM jsonb_array_elements(c.targets) t

The jsonb_build_object() function will take the key/value pairs it's given, and create a new Object. The source data for this is t, which is the output of the jsonb_array_elements() function run on the targets field - the JSONB field that's holding this data. Then jsonb_agg() gathers the new, smaller Objects back up into a single Array.
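If you want to see what jsonb_array_elements() does on its own, here's a quick standalone check - the inline array is just made-up sample data:

  SELECT t.value->>'id' AS id, t.value->>'score' AS score
    FROM jsonb_array_elements(
           '[{"id": 101, "score": 91}, {"id": 102, "score": 87}]'::jsonb
         ) t;

That gives back one row per element - two rows here - which is what lets the sub-select treat the Array like a table.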

Then to pull this together, you might have a query like:

  SELECT c.id, c.name,
         (SELECT jsonb_agg(jsonb_build_object(
                   'id', t.value->>'id',
                   'score', t.value->>'score'))
            FROM jsonb_array_elements(c.targets) t) AS targets
    FROM company c
   WHERE ...
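One small thing to watch: the ->> operator returns text, so the id and score in the rebuilt Objects will come back as JSON strings. If you want them to stay JSON numbers, use the -> operator instead, which keeps the values as jsonb:

  jsonb_build_object(
    'id', t.value->'id',
    'score', t.value->'score'
  )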

It's not easy... but it's powerful, and there are a lot of things that can be done with this kind of manipulation... it's just going to take a little practice. 🙂

Ordered an M3 Max MacBook Pro

November 5th, 2023

Black MacBook Pro

I thought about it for a bit, but the boost in performance, and the doubling of the memory were really the key points that made it a decision worth making. So I ordered it. A new M3 Max MacBook Pro - Black, 128GB RAM, and 2TB storage (seems silly to call it a disk, or drive anymore). And the target delivery date is around the end of the month. Not bad.

After watching the Scary Fast announcement, it was clear that skipping the M2 Max was going to make this jump a much more significant one, and I was right. Also, there's nothing really wrong with my existing M1 Max MacBook Pro, but the doubling of memory and the 50% increase in speed are going to be things I use every single day.

The new Space Black color looks good, and I'm sure it'll be just fine with regards to the fingerprints mentioned in so many reviews, and it'll be nice to see it next to my iPad Pro that's similarly dark... it should be a nice addition. 🙂

macOS 14.1 Update Fixed WebKit Issue

October 27th, 2023

Yosemite

This morning I'm very happy to see that the issue I've been having with macOS Sonoma 14.0 appears to be gone. I like the upgrade, for the most part, but what I'd noticed was that the memory usage for Safari and Safari Technology Preview would rise to the point of crashing the system. This meant that I had to have Activity Monitor running all the time, to make sure the web pages that were getting too big were reloaded, or dropped, when their footprint got the memory pressure into the "yellow" - before it went "red".

I had expected that this was a dot 0 issue, and I was right - with the 14.1 update earlier this week, the memory footprint has started low, and stayed there - for a few days. Now I'll probably run Activity Monitor through the weekend, just to make sure, but I have a good feeling that this is something that got cleared up, and I'm not going to see a recurrence of the problem.

I have enjoyed macOS, and the Mac System long before Cocoa and Foundation, and this is the kind of thing I'm glad to see I was right about. They move forward, but pay attention, and fix the little things as they go. What a great team. 🙂

Moving to Postgres 16.0

October 20th, 2023

PostgreSQL.jpg

This morning I noticed that not only was Postgres 14.9 out, but 15.x and even 16.0 had been released. It's unusual for me to be a full major version behind on my main laptop, but being two behind was just something that had to be corrected.

Several months ago, it was clear that the official Postgres builds were no longer being done by the Postgres group, and so the support for 15.0 wasn't in Homebrew. I figured it'd just be a little bit, and then things would start back up again. But that was not the case. What happened, instead, was that the Homebrew volunteers took it upon themselves to build the packages for 14.x, 15, and now 16.

So let's write this all down so it's easy to do next time we change a major version of Postgres. Start by saving everything in all the databases:

  $ pg_dumpall > dump.sql
  $ brew services stop postgresql@14

Now we can wipe out the old install and its data:

  $ brew uninstall postgresql@14
  $ rm -rf /opt/homebrew/var/postgresql@14

Now we install the new version, start it, and load back up the data:

  $ brew install postgresql@16
  $ brew services start postgresql@16
  $ psql -d postgres -f dump.sql
  $ psql -l

If the command psql doesn't show up in the path, just relink the package:

  $ brew link postgresql@16

Then it should be in the right spot.
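To double-check that the shell is picking up the new binary, which psql should point into the Homebrew bin directory, something like:

  $ which psql
  /opt/homebrew/bin/psql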

At this point, it's all loaded up and you can ditch the dump.sql file, as it's no longer needed, and the new version is active:

  $ psql --version
  psql (PostgreSQL) 16.0 (Homebrew)

Not bad at all. 🙂

Nice Postgres Feature: LATERAL

September 12th, 2023

PostgreSQL.jpg

There are many times when you would like a sub-select query to be constrained on one of the values of the main query, but when you attempt to do that you get an error message about not being able to use the variable in this context. For example, this query:

  SELECT c.id, c.company_name, pb.available, pb.current,
         date_trunc('second', pb.as_of) AS as_of, pbs.*
    FROM companies c, plaid_tokens pt, plaid_balances pb,
         (SELECT SUM(available) AS all_available,
                 SUM(current) AS all_current
            FROM plaid_balances WHERE company_id=c.id) pbs
   WHERE pt.id = (c.additional_info->>'primaryPlaidAccount')::uuid
     AND pt.account_id = pb.account_id

where the goal is to have a sub-select gather the sums of the individual columns being pulled in the main query. It's a nice thing to have, but not being able to use c.id in the sub-select really makes it difficult.

Postgres has a nice feature in LATERAL, which allows the sub-select to reference these fields by changing the order of evaluation of the sub-select, and it doesn't penalize the performance too much.

  SELECT c.id, c.company_name, pb.available, pb.current,
         date_trunc('second', pb.as_of) AS as_of, pbs.*
    FROM companies c, plaid_tokens pt, plaid_balances pb,
         LATERAL (SELECT SUM(available) AS all_available,
                         SUM(current) AS all_current
                    FROM plaid_balances WHERE company_id=c.id) pbs
   WHERE pt.id = (c.additional_info->>'primaryPlaidAccount')::uuid
     AND pt.account_id = pb.account_id

This is still quick, and it saves the machinations of having to calculate the sums in a temp table, or write a function to do this... it's just a nice little trick that they put in the language. Very considerate. 🙂
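For what it's worth, the same thing can be written with the explicit JOIN LATERAL form, which makes the correlation a little easier to see - trimmed down here to just the companies table and the sums:

  SELECT c.id, c.company_name, pbs.all_available, pbs.all_current
    FROM companies c
    JOIN LATERAL (SELECT SUM(available) AS all_available,
                         SUM(current) AS all_current
                    FROM plaid_balances
                   WHERE company_id = c.id) pbs ON true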

The Passing of a Legend

August 6th, 2023

vim.jpg

This morning, a friend sent an email with a link mentioning the passing of Bram Moolenaar, the initial creator of Vim. There aren't many folks who have impacted my professional life as much as the creators of Vi, and then Bram and Vim.

I remember first using Vi at Purdue in my final year of Undergrad on the ADM terminals, and then making sure I could find termcap entries for all the terminals I could get my hands on in the days when you had terminals hooked by serial lines to the Dual VAX 11/780 system at the Electrical Engineering Department. After that, it was Auburn, and Vim on my Amiga, and then Mac System 6, and on virtually every system I had from 1985 on.

The only tool that even comes close to that longevity is GCC.

I know nothing lasts forever, and I know people pass on, but I honestly didn't expect to be so surprised by this news. I read it again, a couple of hours later on one of the RSS Feeds I read on my iPad, where, again, I have Vim. Still the same sense of sadness.

Ask not for whom the bell tolls...

Getting the Water Tested

July 10th, 2023

Government and Laws

This morning they dropped off the sample container for the Water Testing that Naperville does once a year. It's completely voluntary, and funded by the Town, so that those of us who live in very old homes can be assured that the lead and other toxins in the water supply are well within the EPA limits.

I think it's been happening for about three years now, and I sign up for it each year because all I have to do is take the sample, set it out on my doorstep, and then I get a letter in a few weeks with the results of the tests. They have always been far below any of the EPA's threat levels, so I'm not concerned, but I'll always accept a free test to make sure things are safe in my environment.

I do love this Town. 🙂

Visiting with My Cousin Murry

July 8th, 2023

Path

Today I traveled to Indy to have a nice get-together with my siblings and our cousin Murry, on our Dad's side of the family. He's my age, and has spent almost all his adult life in Finance - running a Hedge Fund for a while before he rolled it up in 2022. He's now in Miami and doing a lot of traveling, and this stop in Indy was part of his "Touch Base with Family 2023" Tour.

A few months ago, I reached out to Murry and added him to my Sunday Messages list, where I send a little message about the family and how we're doing, and it's nothing special, but it keeps us in touch. I've been doing this with family and friends for a few years, and it's something I really feel is important.

Well... Murry is doing fine, and everyone had a good time. I really think this is good - the reconnections. Someday soon, there will be fewer of us...

Nice PostgreSQL Trick

June 29th, 2023

PostgreSQL.jpg

This morning I really wanted to be able to set the psql prompt in my sessions, because the system we have for creating databases doesn't really create nicely human-readable names, and even so, I'd like the name in the prompt to match the branch of the code I'm working on... it just works out better.

So I started digging, and the -c parameter is OK, but the psql session terminates after that - so that's not going to work. Piping in the \set commands seemed to be problematic as well, and then I found this:

  $ psql --set=PROMPT1="${br}%R%#%x "

where br is the name of the branch I'm working on. This could then be obtained from git easily, and then put into a function, and it works great!
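For example, with br set to main, connected as a non-superuser and outside of a transaction, the prompt renders as something like:

  main=>

And here's the function that pulls it all together: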

  #
  # "platter psql (postgres command line tool)" - start a psql session on
  # the provided branch in the PLATTER_INSTANCE in the .env file in the
  # current directory (a Node project repo). If no branch is provided, then
  # the current git branch will be used.
  #
  function ppsql() {
    if [ -f .env ]; then
      set -o allexport; source .env; set +o allexport
      local br=$1
      if [ ${#br} -eq 0 ]; then
        br=`cat .git/HEAD | sed -e 's:^.*/::'`
      fi
      local url="`npx platter postgres branch url ${br} \
           --instance $PLATTER_INSTANCE | tr -d '\n'`?sslmode=require"
      psql --set=PROMPT1="${br}%R%#%x " --set=PROMPT2="${br}%R%#%x " $url
    else
      echo "Not in a Node project directory!"
    fi
  }
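Then, from inside a project repo, using it is as simple as this - where my-feature is just a hypothetical branch name:

  $ ppsql                 # use the current git branch
  $ ppsql my-feature      # or name the branch explicitly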

With this, it's so easy now to keep track of the database (branch) I'm on with Platter, and that makes a really big difference to my peace of mind. 🙂