Archive for October, 2022

Interesting Node sleep() Function

Wednesday, October 26th, 2022

NodeJS

Today I had a reason to look at some Node issues with async processing, and ran across these two little functions that are interesting, but quite deadly. They pause the Node runtime for the specified number of milliseconds or seconds, which is nice when you have to have a delay, but all Node processing stops. This means none of the pending async calls get processed, either.

  function msleep(n) {
    // Atomics.wait() blocks the calling thread until it times out, so this
    // freezes the entire event loop - timers, I/O, everything - for n ms
    Atomics.wait(new Int32Array(new SharedArrayBuffer(4)), 0, 0, n);
  }
 
  function sleep(n) {
    msleep(n*1000);
  }

Simple. Easy. But not really what I was looking for. 🙂
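For contrast, the non-blocking version of a sleep() in Node is the usual Promise wrapped around setTimeout(); a sketch of the kind of thing I was actually after:

```javascript
// non-blocking sleep: resolves after ms milliseconds, but the event loop
// keeps running, so timers, I/O, and other async work proceed while we wait
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms))

async function demo() {
  console.log('waiting...')
  await sleep(250)
  console.log('done - and any other pending async work ran in the meantime')
}

demo()
```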

Upgraded to macOS Ventura

Wednesday, October 26th, 2022

Yosemite

This morning I took the time to get macOS Ventura 13.0 up and running on my main laptop, and I'm glad I did. Lots of nice things are coming, and I'd updated my iPhone and iPad Pro to 16.1 yesterday, so today was the day for the laptop.

I did notice that Sublime Text launched much faster on macOS 13.0. The only wrinkle was that Safari Technology Preview wouldn't run, of course, so I had to download the latest version from Apple, and now it's back.

All in all, a successful morning.

Apple Health, COVID, and Medical Records

Friday, October 21st, 2022

Microbe

About a week ago, I got my latest Moderna COVID Booster - with the Flu Shot chaser - and I wanted to give Walgreens, my Insurance Company, and my Doctor a chance for all the bits to be pushed around, so that I could update my COVID Vaccination Card in my iOS Wallet. Today was the day.

I had done this before, but it was a while back, and so I expected a similarly smooth operation, and they didn't disappoint. 🙂 I went to my Doctor's website, they had a button to pull up the COVID Proof, and I could refresh the records to show all four shots, and then get a QR Code. Pull that up in the camera, click the link, and Bam! It's in. Just as easy as I remembered.

But the surprise is that I was able to link my iPhone's Health App to my Doctor's back-end system and receive updates, etc. Now I know there will be plenty of folks saying this isn't safe, or secure, but honestly - nothing really is - not with computers these days, as I've worked with the devs writing this code... and if anyone is going to try really hard to keep it secure, it's the largish Medical Groups, and Apple - so I feel pretty good about it.

But it's really nice to do the Auth within the Health App, get the 2FA, and then link my Doctor's records about me to my Phone. Why? Because when I need it most - like when I'm in an accident - knowing that I have a Doctor will really help the folks trying to help me... even when I can't speak.

And the way it so smoothly integrated with iOS... maybe I'm just jaded from my work in the field, but that was done well... and with a lot of attention to details. Very nice. 🙂

Found a Nice Async Batching Library

Tuesday, October 18th, 2022

NodeJS

Yesterday, I was doing a little work and noticed a lot of connection resets on a service that has been flawless for more than 18 months. To be fair, the load has been rising, and after digging into the cause, it appeared that the issue was overloading the Client with so many requests that it just failed.

Typically, a client will apply back-pressure on the caller to make sure that things don't get to this point, or it will queue the requests in memory so that they can be processed, in turn, as they arrive. I'm not exactly sure what's happening (the developers of the Client are looking into it), but I needed to find something to ease the load, and so I found asyncBatch().

Let's say I had the following code:

  const balances = (await Promise.all(companies
    .map(async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    })))
    .filter(bal => bal !== undefined)

we're running through all the items in the companies array, and for each, we are calling minimumDueForCompany() and then checking a few things, and then filtering on those that we want to see. Simple.

But if we have more than 200 elements in the companies array, and minimumDueForCompany() employs several database queries, we could get to the point of launching more than a thousand hits at nearly the same time. If this is a background task, it might starve some more important tasks with all the database work.

A batching solution was needed. And so I went looking.

asyncBatch() follows much the same style as Promise.all(); it just takes the values as arguments: the array, the function, and the batch size:

  const asyncBatch = require('async-batch').default
 
  const balances = (await asyncBatch(companies,
    async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    }, 2))
    .filter(bal => bal !== undefined)

With a batch size of 2, we'll start simply, letting the background task take a little longer while ensuring the more immediate, user-facing calls have priority access.

Put this in, and things are working better. It's not a perfect solution, and we still need to have the Client improved, but it gets around the two problems: flooding the database when the use-case doesn't require it, and failures on the Client when handling the flood. We can fine-tune the batch size later.

UPDATE: it turned out that the library launched all the work in an initial Promise.all(), so it really wasn't batching the work as I'd expected. So I wrote my own using the chunk library:

  const chunk = require('chunk')
 
  /*
   * We need a function that will batch the equivalent of:
   *
   *   const resp = await Promise.all(arr.map(itm => fcn(itm)))
   *
   * but do it in batches, so that when we get a large workload, we don't
   * overwhelm the system. This is that function. The first argument is the
   * array to process, the second is the async function, that takes one
   * argument, and the last is the batch size that defaults to a reasonable
   * value.
   */
  const asyncBatch = async (arr, fcn, batchSize = 4) => {
    const ans = []
    for (const b of chunk(arr, batchSize)) {
      const blk = await Promise.all(b.map(itm => fcn(itm)))
      ans.push(...blk)
    }
    return ans
  }

This works exactly as expected, working on n of the elements at a time, and then moving to the next batch. Much cleaner.
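As a sanity check on the pattern, here's a stand-alone sketch (with a tiny inline chunk() helper in place of the npm package, and a dummy work function that tracks concurrency) showing that no more than batchSize calls ever run at once:

```javascript
// inline stand-in for the npm "chunk" package, so this runs by itself
const chunk = (arr, size) => {
  const out = []
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size))
  return out
}

const asyncBatch = async (arr, fcn, batchSize = 4) => {
  const ans = []
  for (const b of chunk(arr, batchSize)) {
    const blk = await Promise.all(b.map(itm => fcn(itm)))
    ans.push(...blk)
  }
  return ans
}

// dummy workload that records the peak number of concurrent calls
let inFlight = 0
let peak = 0
const work = async n => {
  inFlight += 1
  peak = Math.max(peak, inFlight)
  await new Promise(r => setTimeout(r, 10))
  inFlight -= 1
  return n * 2
}

asyncBatch([1, 2, 3, 4, 5, 6, 7, 8], work, 2).then(res => {
  console.log(res)   // [ 2, 4, 6, 8, 10, 12, 14, 16 ]
  console.log(peak)  // 2 - never more than the batch size
})
```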

Got New Moderna Booster & Flu Shot

Sunday, October 16th, 2022

Microbe

Today it was time to get the latest Moderna COVID Booster, and back it up with the standard Flu Shot. It's been a while since my last booster, and with the new variants, it was clearly time to schedule a visit to Walgreens just down the street, where it's so easy to get the shots.

So in a few weeks, I'll be all set for Thanksgiving which is about my only social event of the Fall, and I'll be safe from the worst of it. Science is a good thing. 🙂

Adding Let’s Encrypt Certs to Nginx

Thursday, October 13th, 2022

Linode

This morning I had some time and wanted to finish up the work of getting my Cloud VM running Ubuntu 22.04 working just fine as a development box - including inbound webhooks from vendors, and calls from apps like HTTPbot on my iPad Pro. The key was that I needed to be able to install and configure nginx to forward all port 443 traffic to port 6543, and that also meant getting the nginx server to be listening on port 443 with a legit certificate.

Turns out, it wasn't as bad as I thought it might be. 🙂

Starting with my Ubuntu 22.04 install, I added the packages I was going to need, based on this blog post on the nginx site.

  $ sudo apt-get -y install --no-install-recommends nginx certbot python3-certbot-nginx

Once these are installed, we can set the server_name in the nginx config by editing the default site:

  $ sudo nano /etc/nginx/sites-enabled/default

and update the server_name line to be:

  server_name mybox.mydomain.com;

and then we can get the initial certificate from Let's Encrypt (registering a new email account with them in the process) with:

  $ sudo certbot --nginx -d mybox.mydomain.com -d mydomain.com

where the second -d argument adds an additional domain to the certificate. I didn't actually need it, so I just had the one -d pair on my certbot command.

After this, we edit the config file again, updating the port 443 section's location specification with:

  location / {
    # forward all HTTPS traffic to port 6543
    proxy_set_header  X-Forwarded-For $remote_addr;
    proxy_set_header  Host $http_host;
    proxy_pass        "http://127.0.0.1:6543";
  }

and then verify the nginx config with:

  $ sudo nginx -t

and then tell nginx to reload the config with:

  $ sudo nginx -s reload

At this point, the box is answering HTTPS traffic, and forwarding it on to the Node service at port 6543. Excellent. 🙂
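A quick smoke test from another machine confirms the whole chain (mybox.mydomain.com being the placeholder hostname from above):

```shell
# should return the Node service's response, served over HTTPS by nginx
$ curl -i https://mybox.mydomain.com/

# inspect the certificate the server is presenting
$ echo | openssl s_client -connect mybox.mydomain.com:443 2>/dev/null | openssl x509 -noout -issuer -dates
```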

In order to refresh the Let's Encrypt certificate on time, let's add a simple crontab entry. Since the renewal needs root privileges, edit root's crontab:

  $ sudo crontab -e

and then have the entries:

  # run all the commands on Bash not Bourne Shell
  SHELL=/bin/bash
  # send all the mail to my main account
  MAILTO=bob@mydomain.com
 
  # check the Let's Encrypt certificate each day at noon UTC
  0 12 * * *   sudo /usr/bin/certbot renew --quiet

And that should do it.
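Before trusting the scheduled job, certbot can simulate a renewal against the staging environment to prove the plumbing works:

```shell
# exercises the full renewal path without issuing a real certificate;
# exits non-zero if the renewal would fail
$ sudo certbot renew --dry-run
```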

Setting up iPad Pro Development

Tuesday, October 11th, 2022

IPadPro

I have been doing some manual tasks that I decided today I really wanted to automate. The problem was, it involved updating a few rows in a database, and I didn't really want to expose an endpoint to achieve this, just on the very slim chance that the service might get hacked. I know, we protect these things carefully with Google Cloud Tasks, but I just felt these tasks needed to be done outside the scope of the running service, and it was possible, but tiring, to do it manually. First with psql, and then with a bash script that took command-line options, called psql, and made the updates.

In all, it wasn't a bad idea. And it was working just fine - except for the tired part. 🙂 So while I didn't want to set up a crontab job on my laptop, I could set up one on a Cloud VM that is only accessible by me based on an RSA key. That is secure, and it's out on the network, so I don't have to worry about my internet service going down.

So I needed some way to replicate my entire development environment onto my iPad Pro, and then, with the tools I've pulled together, push it that last little bit to make this happen.

Starting with Working Copy on the iPad, I have a complete, stand-alone Git client that has created local directories on my iPad, so that an editor like Textastic can access these files and save to them, and Working Copy will detect the changes. Additionally, Textastic can upload the files to a host using scp, so I can edit and save, upload and test, and then make a PR and push up to GitHub.

I needed to be able to run Node, and a few more watching commands on the box, so I had to get everything up on the Ubuntu 22.04 box in the Cloud, and then wrangle everything to get it all going. The most troublesome thing is that mosh, the Mobile Shell I use with Blink on my iPad, doesn't allow for SSH agent forwarding. It's something about mosh, and I understand the reason, but I also know the mosh folks are working to fix this - still, it is annoying not to be able to use git within mosh because the SSH keys aren't carried up in the communication stack.

Still, with conventional ssh, they are carried, and I can use that as needed. Someday soon, there will be a way to use something like Guardian Agent for the SSH agent forwarding in mosh, but for now, this works.

At this point, I can edit locally... push to the cloud, have nodemon restart the server on any change... and hit the service on the standard Node port 6543. But then I needed to get nginx forwarding HTTPS to port 6543, and that's another challenge for another day.

What I can do is run the bash scripts on the Cloud VM, and then crontab the runs so that I never again have to worry about being so tired from running these commands at odd hours. 🙂
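The crontab entries for those runs might look something like this sketch; the script path, log path, and schedule are hypothetical stand-ins, not the actual job:

```shell
# run the row-update script every night at 02:30 server time, keeping a log
# (/home/bob/bin/update-rows.sh is a hypothetical path, for illustration)
30 2 * * *   /home/bob/bin/update-rows.sh >> /home/bob/logs/update-rows.log 2>&1
```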