Merry Christmas!

December 25th, 2022

Christmas Tree

It's another amazing Christmas, and all is quiet. Quiet times... noisy times... it's all a wonderful time of year to spend time with family and friends and enjoy the Season. It's time to enjoy a nice movie, or maybe a football game, and to have some special treats that you don't normally have - all to remember the traditions you grew up with.

Today I'll just be relaxing and enjoying the quiet. It's Sunday, so I'll send out my weekly texts to family and friends, and then sit back and enjoy the quiet of the day. It's just my favorite time of year. 🙂

Big Update Morning

December 14th, 2022

Yosemite

This morning has been a Big Update morning to be sure... iOS 16.2, tvOS 16.2, iPadOS 16.2, and macOS 13.1 all ready to go from Apple, and so everything I had went through the update. It's nice to get things up to date - not just because some of the things I've seen in Safari on macOS and iPadOS have been iffy... but because it's a chance to get some little surprises as well.

One thing I've read about that I didn't really like is the new Up Next on tvOS... it used to be the things in my list, and now it's almost advertising for the things that some folks think I might be interested in. Not a huge fan, and I would love to have a way to get back to the way it was. We'll see...

Until then, things are humming along, and it's nice to get it all done in an early morning. 🙂

Advent of Code 2022 is On!

December 1st, 2022

Christmas Tree

This morning I did the first day of the 2022 Advent of Code. What fun it is to get back into Clojure - for a month. If I thought it the least bit reasonable, I'd be doing back-end ClojureScript, since it compiles to JavaScript in the same way TypeScript does, so it would run on the same stack with the same speed, etc. But it's just too big a leap for most folks, and it's not worth the education cycles.

But still... the simplicity of the language, and its ability to run in highly multi-threaded environments, is a huge win, and so it will remain one of my very favorite languages.

Node, Docker, Google Cloud, and Environment Variables

November 14th, 2022

GoogleCloud

At The Shop, we're using Google Cloud Run for a containerized API written in Node, and it's a fine solution - really. But one of the issues we have run into is that of environment variables. We have a lot of them. The configuration for dev versus prod versus local development is all held in environment variables, and the standard way is for these to be passed in the cloudbuild.yaml file in the Build step:


steps:
  - name: gcr.io/cloud-builders/docker
    entrypoint: '/bin/bash'
    args:
      - '-c'
      - >-
        docker build --no-cache
        --build-arg BRANCH_NAME=$BRANCH_NAME
        --build-arg THESHOP_ENV=$_THESHOP_ENV
        --build-arg BASE_API_URL=$_BASE_API_URL
        -t $_GCR_HOSTNAME/$PROJECT_ID/$REPO_NAME/$_SERVICE_NAME:$COMMIT_SHA
        . -f Dockerfile
    id: Build

and then in the Dockerfile, you have:

ARG BRANCH_NAME
RUN test -n "$BRANCH_NAME" || (echo 'please pass in --build-arg BRANCH_NAME' && exit 1)
ENV BRANCH_NAME=${BRANCH_NAME}
 
ARG THESHOP_ENV
RUN test -n "$THESHOP_ENV" || (echo 'please pass in --build-arg THESHOP_ENV' && exit 1)
ENV THESHOP_ENV=${THESHOP_ENV}
 
ARG BASE_API_URL
RUN test -n "$BASE_API_URL" || (echo 'please pass in --build-arg BASE_API_URL' && exit 1)
ENV BASE_API_URL=${BASE_API_URL}

This will place them in the environment of the built container. And all of this is fine, until you start to hit the limits.

The cloudbuild.yaml command has a limit of 4000 characters, and if you have large environment variables, or a sufficient number of them, you can exceed this - and we have. There is also a limit of 20 arguments to the docker build command, so again, we run into trouble if the number of environment variables grows beyond that. So what can be done?

Well... since we are using Google Cloud Secrets, we could write something to scan those Secrets, pull them all into the running process, and stuff them into the process.env map for Node. But therein lies another problem: Node is asynchronous, so if we have top-level definitions that use these environment variables - like, say, clients for Vendor services - then it's quite possible they will need those variables before we have had the chance to load them.
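To make that concrete, here's a minimal sketch of the failure mode - the vendor-sdk package, VendorClient, and VENDOR_API_KEY are all hypothetical stand-ins:

  // clients.js - evaluated the moment it's require()d
  const { VendorClient } = require('vendor-sdk')   // hypothetical SDK

  // this reads process.env at require() time... if the Secrets are still
  // being loaded asynchronously, the key is undefined, and there's no way
  // to 'await' anything at this point in a CommonJS module
  const client = new VendorClient({ apiKey: process.env.VENDOR_API_KEY })

  module.exports = { client }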

So what can we do?

The solution that seems to work is to have a separate app that runs in the Dockerfile and generates a .env file that resides only in the container - built at the time the container is built, and containing all the environment variables we need. Then the Node app can just use these with the dotenv library.

To make this file, we have the end of the Dockerfile look like:

# now copy everything over to the container to be made...
COPY . .
# run the node script to generate the .env file
RUN THESHOP_ENV=${THESHOP_ENV} \
  GCP_SECRETS_API_EMAIL=${GCP_SECRETS_API_EMAIL} \
  GCP_SECRETS_API_KEY=${GCP_SECRETS_API_KEY} \
  GCP_BUILD_PROJECT=${GCP_BUILD_PROJECT} \
  npm run create-env
# run the migrations for the database to keep things up to date
RUN npx migrate up --store='@platter/migrate-store'
EXPOSE 8080
CMD [ "node", "-r", "dotenv/config", "./bin/www" ]

This way, we give the create-env script the few key environment variables it needs to read the Google Cloud Secrets, and then it generates the file. The create-env script is defined in the package.json as:

{
  "scripts": {
    "create-env": "node -r dotenv/config tools/make-env"
  }
}

and then the script itself is:

const arg = require('arg')
const { execSync } = require('child_process')
const { addSecretsToEnv } = require('../secrets')
const { log } = require('../logging')
 
const _help = `Help on command usage:
  npm run create-env -- --help         - show this message
  npm run create-env -- --file <name>  - where to write the env [.env]
  npm run create-env -- --verbose      - be noisy about it
 
  Nothing is required other than the THESHOP_ENV and some GCP env variables
  that can be specified on the command line.`;
 
/*
 * This is the main entry point for the script. We will simply read in all
 * the secrets for the THESHOP_ENV defined environment from the Cloud
 * Secrets, and then write them all to the '.env' file, as the default.
 * This will allow us to set up this environment nicely in a Dockerfile.
 */
(async () => {
  // only do this if we are run directly from 'npm run'...
  if (!module.parent) {
    // let's process the arguments and then do what they are asking
    const args = arg({
      '--help': Boolean,
      '--verbose': Boolean,
      '--file': String,
    })
    // if they asked for help, show it and bail
    if (args['--help']) {
      console.log(_help)
      return
    }
    // break it into what we need
    const verbose = args['--verbose']
    const where = args['--file'] ?? '.env'
 
    // ... now let's pull in all the appropriate Secrets to the local env...
    log.info(`[makeEnv] loading the Secrets for ${process.env.THESHOP_ENV} into this environment...`)
    const resp = await addSecretsToEnv()
    if (verbose) {
      console.log(resp)
    }
    // ...and now we can write them out to a suitable file
    log.info(`[makeEnv] writing the environment to ${where}...`)
    const ans = execSync(`printenv > ${where}`).toString()
    if (verbose) {
      console.log(ans)
    }
    return
  }
})()

The addSecretsToEnv() is where we use the Google Secrets Node Client to read all the Secrets in our account and, one by one, pull them down and put them into process.env. The fact that this runs before the app starts is how we get around the asynchronous nature of Node, and by having the result be a .env file, we can use all the normal tools to read and process it - and we no longer need to worry about the top-level Vendor clients trying to define themselves with environment variables that haven't been set.
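The heart of addSecretsToEnv() - sketched here with the standard @google-cloud/secret-manager client, with the project lookup and the Secret-name-to-variable mapping as assumptions - is roughly:

  const { SecretManagerServiceClient } = require('@google-cloud/secret-manager')

  // a sketch: list every Secret in the project, pull the 'latest' version
  // of each, and drop it into process.env under the Secret's short name.
  // (assumes default credentials; the real script authenticates with the
  // GCP_SECRETS_API_* values passed in from the Dockerfile)
  const addSecretsToEnv = async () => {
    const client = new SecretManagerServiceClient()
    const parent = `projects/${process.env.GCP_BUILD_PROJECT}`
    const [secrets] = await client.listSecrets({ parent })
    for (const secret of secrets) {
      const [version] = await client.accessSecretVersion({
        name: `${secret.name}/versions/latest`,
      })
      // secret.name is the full resource path - keep just the last component
      process.env[secret.name.split('/').pop()] = version.payload.data.toString('utf8')
    }
  }

  module.exports = { addSecretsToEnv }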

Now if Node had a way to force an async function to finish before moving on, then this wouldn't be necessary, as we'd simply call addSecretsToEnv() in the Node start-up script, well ahead of the loading of the other files. But alas... that's not how CommonJS works.
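For what it's worth, ES modules do get top-level await, which is exactly this "finish before moving on" behavior - a sketch of what the start-up could look like if this were ESM instead of CommonJS (the .mjs paths are hypothetical):

  // index.mjs - top-level await works in ES modules only
  import { addSecretsToEnv } from './secrets/index.mjs'   // hypothetical ESM build

  // nothing below evaluates until the Secrets are in process.env
  await addSecretsToEnv()

  await import('./app.mjs')   // top-level clients now see a loaded env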

This has turned out to be a very workable solution, and we get past the limitations of the cloudbuild.yaml file, which is a great relief.

Flushing DNS Cache on macOS 13 Ventura

November 12th, 2022

Yosemite

This morning I needed to flush the DNS cache on my MacBook Pro, so I looked it up, and wanted to keep it around - so here we are. 🙂 The problem was that a service I use had to change its DNS mapping due to a change in Google Cloud, and the nature of DNS caching is to minimize the hits on the DNS Servers - which makes it hard to "forget" a DNS entry... unless you flush the cache.

It's really not all that hard:

  $ sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder

and after this, the DNS cache is empty, and all services will hit the DNS server for the IP addresses, and everything will detect the "move".

Upgraded Sublime Text to Build 4142

November 10th, 2022

Sublime Text 2

With the update to macOS 13.0.1, I thought it would be a good time to check and see if Sublime Text had an update - I thought I'd been getting notifications, but I guess I missed this one. Still, it's nice to see that they are making updates to the editor.

I was chatting with a friend the other day, and with all the IDEs and Super Editors out there - why Sublime? It was interesting to realize it's the simplest, fastest editor with the minimum of chrome, and it reminds me of the old ProjectBuilder days on NeXT. The editor window is just that - a simple window - and there's no need for all the fancy ornamentation on the window... I know what's happening, and if I need it, I can make it visible. But 99% of the time, I just don't need it - so why clutter the screen?

With this update, there are a host of updates, fixes, and additions, and it's always nice to read the release notes, and realize this is a lot more powerful than I need. It's nice to have the ability - should I need it.

So all in all, a nice day of upgrades. 🙂

Interesting Node sleep() Function

October 26th, 2022

NodeJS

Today I had a reason to look at some Node issues with async processing, and ran across these two little functions that are interesting, but quite deadly. They pause the Node Runtime for the specified number of milliseconds (or seconds), which is nice when you have to have a delay - but all Node processing stops. This means all the pending async calls won't get processed, either.

  // block the entire Node Runtime for n milliseconds - the Atomics.wait()
  // is never notified, so it simply times out after n ms
  function msleep(n) {
    Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, n);
  }

  // the same, but in seconds
  function sleep(n) {
    msleep(n*1000);
  }

Simple. Easy. But not really what I was looking for. 🙂
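For contrast, here's a minimal sketch of the non-blocking version - a Promise wrapped around setTimeout() - which pauses only the calling async function and lets the rest of the event loop keep running:

  // pause just the caller - the event loop (and all other async work)
  // keeps running while we wait
  const sleep = (ms) => new Promise(resolve => setTimeout(resolve, ms))

  // usage, somewhere inside an async function:
  //   await sleep(1500)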

Upgraded to macOS Ventura

October 26th, 2022

Yosemite

This morning I took the time to get macOS Ventura 13.0 up and running on my main laptop, and I'm glad I did. Lots of nice things are coming, and I'd updated my iPhone and iPad Pro to 16.1 yesterday, so today was the day for the laptop.

I did notice that Sublime Text launched much faster on macOS 13.0, and that the only wrinkle was that Safari Technology Preview wouldn't run, of course, so I had to download the latest version from Apple, and now it's back.

All in all, a successful morning.

Apple Health, COVID, and Medical Records

October 21st, 2022

Microbe

About a week ago, I got my latest Moderna COVID Booster - with the Flu Shot chaser - and I wanted to give Walgreens, my Insurance Company, and my Doctor a chance for all the bits to be pushed around, so that I could update my COVID Vaccination Card in my iOS Wallet. Today was the day.

I had done this before, but it was a while back, and so I expected a similar smooth operation, and they didn't disappoint. 🙂 I went to my Doctor's website, they had a button to pull up the COVID Proof, and I could refresh the records to show all four shots, and then get a QR Code. Pull that up in the camera, and click the link, and Bam! It's in. Just as easy as I remembered.

But the surprise is that I was able to link my iPhone's Health App to my Doctor's back-end system and receive updates, etc. Now I know there will be plenty of folks saying this isn't safe, or secure, but honestly - nothing really is - not with computers these days, as I've worked with the devs writing this code... and if anyone is going to try really hard to keep it secure, it's the largish Medical Groups, and Apple - so I feel pretty good about it.

But it's really nice to do the Auth within the Health App, and get the 2FA, and then link my Doctor's records about me to my Phone. Why? Because when I need it most - knowing that I have a Doctor - like when I'm in an accident - will really help the folks trying to help me... even when I can't speak.

And the way it so smoothly integrated with iOS... maybe I'm just jaded from my work in the field, but that was done well... and with a lot of attention to details. Very nice. 🙂

Found a Nice Async Batching Library

October 18th, 2022

NodeJS

Yesterday, I was doing a little work and noticed that I was getting a lot of connection resets on a service that has been flawless for more than 18 months. To be fair, the load has been rising, and after digging into the cause, it appeared that the issue was overloading the Client with so many requests that it just failed.

Typically, a client will apply back-pressure on the caller to make sure that things don't get to this point, or it will queue the requests in memory so that they can be processed, in turn, as they arrived. I'm not exactly sure what's happening - the developers of the Client are looking at it - but I needed to find something to ease the load, and so I found asyncBatch().

Let's say I had the following code:

  const balances = (await Promise.all(companies
    .map(async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    })))
    .filter(bal => bal !== undefined)

we're running through all the items in the companies array, and for each, we are calling minimumDueForCompany() and then checking a few things, and then filtering on those that we want to see. Simple.

But if we have more than 200 elements in the companies array, and minimumDueForCompany() employs several database queries, we could get to the point of launching more than a thousand hits at nearly the same time. If this is a background task, it might starve some more important tasks with all the database work.

A batching solution was needed. And so I went looking.

asyncBatch() follows much the same style as Promise.all(); it just takes the values as arguments: the array, the function, and the batch size:

  const asyncBatch = require('async-batch').default
 
  const balances = (await asyncBatch(companies,
    async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    }, 2))
    .filter(bal => bal !== undefined)

With a batch size of 2, we'll start simply, letting the background task take a little longer while ensuring that the more immediate, user-facing calls still have priority access.

Putting this in, things are working better. It's not a perfect solution, and we still need to have the Client improved, but it gets around the two problems: flooding the database when the use-case doesn't require it... and failures in the Client handling the flood. We can fine-tune the batch size later.

UPDATE: it turned out that the library launched all the work in an initial Promise.all(), so it really wasn't batching the work as I'd expected. So I wrote my own using the chunk library:

  const chunk = require('chunk')
 
  /*
   * We need a function that will batch the equivalent of:
   *
   *   const resp = await Promise.all(arr.map(itm => fcn(itm)))
   *
   * but do it in batches, so that when we get a large workload, we don't
   * overwhelm the system. This is that function. The first argument is the
   * array to process, the second is the async function, that takes one
   * argument, and the last is the batch size that defaults to a reasonable
   * value.
   */
  const asyncBatch = async (arr, fcn, batchSize = 4) => {
    const ans = []
    for (const b of chunk(arr, batchSize)) {
      const blk = await Promise.all(b.map(itm => fcn(itm)))
      ans.push(...blk)
    }
    return ans
  }

This works exactly as expected, working on batchSize of the elements at a time, and then moving on to the next batch. Much cleaner.
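Swapping it in for the library version is painless, since the call shape is the same. A sketch, with the earlier callback pulled out into a hypothetical named function fetchBalanceForCompany():

  // same shape as before: the array, the async function, and the batch size
  const balances = (await asyncBatch(companies, fetchBalanceForCompany, 4))
    .filter(bal => bal !== undefined)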