Upgraded to Postgres 16.7

February 20th, 2025

PostgreSQL.jpg

I noticed that Postgres 16 was updated in Homebrew, so I took the time to upgrade the installation on my MacBook Pro, and it was remarkably easy. Because it was a simple "dot" upgrade, the installer did all the heavy lifting:

 $ brew upgrade postgresql@16

and then when it was done, simply restart the service with:

 $ brew services restart postgresql@16

And that was it. Everything is up and running just fine. What a treat. 🙂

This is the second "dot" upgrade I've done with Postgres 16 and Homebrew, and I just can't get over how clean and simple it is. I've checked all the databases, and they're all in good shape; everything is fine.
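If you want to double-check things after the restart, a couple of quick commands will confirm the service is running and the version being served - assuming the default Homebrew setup:

 $ brew services list
 $ psql postgres -c 'SHOW server_version;'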

Upgraded to Java 17.0.14 and 11.0.26

February 20th, 2025

java-logo-thumb.png

I have been looking at starting some projects in Clojure for work, and I thought it would be good to get the latest JDK 17 from Homebrew and Temurin. As it turns out, the latest JDK 17 is now 17.0.14, and since I had a slightly older release, and the Homebrew cask name had changed, I had to:

  $ brew tap homebrew/cask

and then to actually update it:

  $ brew install --cask temurin@17

When I checked:

  $ java -version
  openjdk version "17.0.14" 2025-01-21
  OpenJDK Runtime Environment Temurin-17.0.14+7 (build 17.0.14+7)
  OpenJDK 64-Bit Server VM Temurin-17.0.14+7 (build 17.0.14+7, mixed mode, sharing)

which is exactly what I was hoping for.

As an interesting note, the re-tapping of the cask updated the name of temurin11 to temurin@11, and I updated JDK 11 as well - why not? It's at 11.0.26, and I might use it... you never know.

Now I'm up-to-date with both versions, and I can easily switch from one to the other with the shell function I wrote. Excellent! 🙂
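For anyone curious, the switching function is nothing fancy. A minimal sketch of the idea, leaning on macOS's /usr/libexec/java_home to locate the installed JDKs (setjdk is just an illustrative name, and my actual function differs a bit):

  # switch the current shell to a given JDK major version, e.g. `setjdk 17` or `setjdk 11`
  # (setjdk is a made-up name for this sketch; /usr/libexec/java_home does the real work)
  setjdk() {
    export JAVA_HOME=$(/usr/libexec/java_home -v "$1")
    export PATH="$JAVA_HOME/bin:$PATH"
  }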

Fire off a Promise

April 19th, 2024

This morning I was thinking about interrogating a Promise in Node, and it bothered me a little that there was no easy way to see if it was complete. Then I thought - maybe there is a different way? So I decided to give this a try.

The use-case is this: I want to be able to fire off a request to another Service, and maybe it'll get back to me in time, and maybe it won't. Either way, I don't want to wait. I have other work to do that must get done. So I want to fire this Promise off, and then, if it's done when I need its result - Great! If not, then Also Great!

But the normal:

  const foo = await callAsyncFunction()

isn't going to work, because that will wait until it's done. So how to work this out?

It turns out that it's not too hard.

  // the Promise-based setTimeout resolves after the given delay
  const { setTimeout } = require('timers/promises')

  const runTest = async () => {
    let done = false
    // fire off the long-running Promise, and flip the flag when it settles
    const foo = setTimeout(5000).then((val) => done = true)
    // meanwhile, keep doing other work, peeking at the state as we go
    for (let i = 0; i < 10; i++) {
      console.log('FOO', foo, done)
      await setTimeout(1000)
    }
  }

  runTest()
    .then(() => console.log('All done running Timing test.'))
    .catch(console.error)
    .finally(() => process.exit())

and when I run this:

  $ node foo.js
  FOO Promise { <pending> } false
  FOO Promise { <pending> } false
  FOO Promise { <pending> } false
  FOO Promise { <pending> } false
  FOO Promise { <pending> } false
  FOO Promise { true } true
  FOO Promise { true } true
  FOO Promise { true } true
  FOO Promise { true } true
  FOO Promise { true } true
  All done running Timing test.

So all we need to do is have a variable that tracks completion, and set it in the .then() call. Sure, it may make a lot more sense to have:

    const foo = setTimeout(5000).then((val) => {
      done = true
      return val
    })

so that we get the value back into foo, but that's easy... the point is to toggle the variable in that .then() and query that, as needed.

This way, I don't have to worry about any unsupported ways of inspecting the Promise's state... it's simple. 🙂
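If this comes up more than once, the same idea can be folded into a tiny helper. Here's a minimal sketch - trackPromise is just an illustrative name, not anything built into Node:

  // wrap any Promise so its settled state can be checked without awaiting it
  // (trackPromise is a made-up name, just for this sketch)
  const trackPromise = (promise) => {
    const state = { done: false, value: undefined, error: undefined }
    state.promise = promise
      .then((val) => { state.value = val })
      .catch((err) => { state.error = err })
      .finally(() => { state.done = true })
    return state
  }

  // usage:
  //   const req = trackPromise(callAsyncFunction())
  //   ... do other work ...
  //   if (req.done) { /* use req.value */ }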

Upgraded to Postgres 16.2

March 26th, 2024

PostgreSQL.jpg

I noticed that Postgres 16 was updated in Homebrew, so I took the time to upgrade the installation on my MacBook Pro, and it was remarkably easy. Because it was a simple "dot" upgrade, the installer did all the heavy lifting:

 $ brew upgrade postgresql@16

and then when it was done, simply restart the service with:

 $ brew services restart postgresql@16

And that was it. Everything is up and running just fine. What a treat. 🙂

Installing New TV

January 27th, 2024

TV.jpg

This morning it was time to see about installing the new TV I ordered, which was delivered this week. It's a nice Sony, with good reviews, and it should last me another 10 years like the last one did, but as before, installing a 75" TV by myself is going to be a bit of a challenge.

That's why I didn't try to install it when it arrived on Wednesday. Better to wait until I had loads of time, like today, to unbox it and get it all ready to go, and if I can't get it set up myself, my daughter is coming over later to help. We will see... keep a positive outlook, and don't push it... it's not small, that's for sure.

Shucks – the TV Went Out

January 6th, 2024

TV.jpg

This morning I turned on the TV in the living room and there was a dark region covering about 40% of the screen, right in the middle of the display. Now this has been a good TV for me for about 10 years, with no problems at all, but this kind of thing usually means replacement, because at 70", the replacement cost is about the same as the repair cost.

I'll give it a few days, but if it doesn't clear up with some power-cycling and cable checks, then it's a goner, and I need to be looking for a replacement.

Happy Birthday to Me!

December 31st, 2023

Cake.jpg

It's been another trip around the sun for me, and so I guess it's time to look back at the year, see what's gone well, see what I learned when things didn't go so well, and try to learn from everything that happened this year. I have to say, it's been a very good year in many respects, and I've learned a lot about dealing with situations I find myself in that I wish I weren't in.

I will admit that I'm a touch surprised that there isn't more snow, or any to speak of, but that will come soon enough, and for now, it's just nice to enjoy the weather we're having.

I am very lucky... I have all that I need, and most of what I want. 🙂

Merry Christmas!

December 25th, 2023

Christmas Tree

It's been a nice, quiet Christmas here, and that's about all I could possibly ask for. I've treated it much like a normal Monday, with a morning run, then the crossword, and a check on what's happening at work.

No major issues for work today, but a few little things that needed to be done. Not bad at all... 🙂

Postgres JSONB Field Manipulations

December 4th, 2023

PostgreSQL.jpg

Since it was announced, the JSONB field in Postgres has been one of the most useful fields I've encountered. It gives you the ability to store complex data in a single field, with very powerful querying tools into that data - as needed. But one thing that has always been a little tough for me is manipulating the data, in SQL, so that it stays as JSON, but is altered from its as-stored value.

Let's say we have a table that has a JSONB field that has an array of Objects, each looking something like this:

  {
    "id": 101,
    "size": "large",
    "score": 91,
    "count": 1232
  }

and there can be more, but what you really want in your SELECT is just an Array of the id and score, and nothing else. It still needs to be an Array, and each element needs to be an Object, but a smaller Object, with just those two attributes.

From the docs, Postgres has jsonb_array_elements() that can turn an Array into something that looks like a table... and we can use the ->> notation to get at the value of an attribute, but how does it all fit together to disassemble, modify, and re-assemble all the elements? A sub-select.

Let's walk through one.

  SELECT jsonb_agg(
           jsonb_build_object(
             'id', t.value->>'id',
             'score', t.value->>'score'
           )
         )
    FROM jsonb_array_elements(c.targets) t

the jsonb_build_object() will take the key/value pairs it's given, and create a new Object. The source data for this is t, which is the output of the jsonb_array_elements() function applied to the targets field, the JSONB field that is holding this data. The jsonb_agg() then gathers all of those new Objects back up into a single Array.
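If you want to try just this piece on its own, it can run against a literal Array - no table needed. The first element is the one from above; the second one is made up, just to show more than one element:

  SELECT jsonb_agg(
           jsonb_build_object(
             'id', t.value->>'id',
             'score', t.value->>'score'
           )
         )
    FROM jsonb_array_elements(
           '[{"id": 101, "size": "large", "score": 91, "count": 1232},
             {"id": 102, "size": "small", "score": 87, "count": 45}]'::jsonb
         ) t;
  -- should return something like: [{"id": "101", "score": "91"}, {"id": "102", "score": "87"}]

One thing to notice: ->> returns text, so the values come back as strings. If you want to keep them as JSON numbers, use -> instead.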

Then to pull this together, you might have a query like:

  SELECT c.id, c.name,
         (SELECT jsonb_agg(jsonb_build_object(
                   'id', t.value->>'id',
                   'score', t.value->>'score'))
            FROM jsonb_array_elements(c.targets) t
           WHERE (t.value->>'id')::int = c.id) AS targets
    FROM company c
   WHERE ...

It's not easy... but it's powerful, and there are a lot of things that can be done with this kind of manipulation... it's just going to take a little practice. 🙂

Ordered an M3 Max MacBook Pro

November 5th, 2023

Black MacBook Pro

I thought about it for a bit, but the boost in performance, and the doubling of the memory were really the key points that made it a decision worth making. So I ordered it. A new M3 Max MacBook Pro - Black, 128GB RAM, and 2TB storage (seems silly to call it a disk, or drive anymore). And the target delivery date is around the end of the month. Not bad.

After watching the Scary Fast announcement, it was clear that skipping the M2 Max was going to make this jump a much more significant one, and I was right. Also, there's nothing really wrong with my existing M1 Max MacBook Pro, but the doubled memory and the 50% increase in speed are things I will use every single day.

The new Space Black color looks good, and I'm sure it'll be just fine with regards to the fingerprints mentioned in so many reviews, and it'll be nice to see it next to my iPad Pro that's similarly dark... it should be a nice addition. 🙂