Archive for the ‘Coding’ Category

Flushing DNS Cache on macOS 13 Ventura

Saturday, November 12th, 2022


This morning I needed to flush the DNS cache on my MacBook Pro, so I looked it up and wanted to keep it around, so here we are. 🙂 The problem was that a service I use had to change its DNS mapping due to a change in Google Cloud, and the nature of DNS caching is to minimize the hits on the DNS servers, which makes it hard to "forget" a DNS entry... unless you flush the cache.

It's really not all that hard:

  $ sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder

and after this, the DNS cache is empty, and all services will hit the DNS server for the IP addresses, and everything will detect the "move".
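If you want to double-check that an entry really is gone, you can ask the resolver directly; something like this, with a placeholder hostname:

  $ dscacheutil -q host -a name service.example.com

and since the cache is now empty, the answer comes straight from the DNS server with the new IP address.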

Upgraded Sublime Text to Build 4142

Thursday, November 10th, 2022


With the update to macOS 13.0.1, I thought it would be a good time to check and see if Sublime Text had an update - because I thought I got notifications, but I guess I missed this one. Still, it's nice to see that they are making updates to the editor.

I was chatting with a friend the other day about why, with all the IDEs and Super Editors out there, I use Sublime. And it was interesting to me to realize it's the simplest, fastest editor with the minimum of chrome, and it reminds me of the old ProjectBuilder days on NeXT. The editor window is just that - a simple window, and there's no need for all the fancy ornamentation on the window... I know what's happening, and if I need it, I can make it visible. But 99% of the time, I just don't need it - so why clutter the screen?

With this update, there are a host of fixes and additions, and it's always nice to read the release notes and realize this is a lot more powerful than I need. It's nice to have the ability - should I need it.

So all in all, a nice day of upgrades. 🙂

Upgraded to macOS Ventura

Wednesday, October 26th, 2022


This morning I took the time to get macOS Ventura 13.0 up and running on my main laptop, and I'm glad I did. Lots of nice things are coming, and I'd updated my iPhone and iPad Pro to 16.1 yesterday, so today was the day for the laptop.

I did notice that Sublime Text launches much faster on macOS 13.0. The only wrinkle was that Safari Technology Preview wouldn't run, of course, so I had to download the latest version from Apple, and now it's back.

All in all, a successful morning.

Found a Nice Async Batching Library

Tuesday, October 18th, 2022


Yesterday, while doing a little work, I noticed that I was getting a lot of connection resets on a service that had been flawless for more than 18 months. To be fair, the load has been rising, and after digging into the cause, it appeared that the issue was that the Client was being overloaded with so many requests that it just failed.

Typically, a client will apply back-pressure on the caller to make sure that things don't get to this point, or it will queue the requests in memory so that they can be processed, in turn, as they arrive. I'm not exactly sure what's happening here; the developers of the Client are looking at it. But I needed to find something to ease the load, and so I found asyncBatch().

Let's say I had the following code:

  const balances = (await Promise.all(companies
    .map(async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    })))
    .filter(bal => bal !== undefined)

we're running through all the items in the companies array, and for each, we are calling minimumDueForCompany() and then checking a few things, and then filtering on those that we want to see. Simple.

But if we have more than 200 elements in the companies array, and minimumDueForCompany() employs several database queries, we could end up launching more than a thousand hits at nearly the same time. If this is a background task, it could starve more important tasks with all the database work.

A batching solution was needed. And so I went looking.

asyncBatch() follows much the same style as Promise.all(); it just takes the values as arguments: the array, the function, and the batch size:

  const asyncBatch = require('async-batch').default
 
  const balances = (await asyncBatch(companies,
    async c => {
      const bal = await minimumDueForCompany(user, c)
      if (bal?.success && !isNil(bal?.interestDue) && bal.billDate === today) {
        bal.company = c
        return bal
      }
      return undefined
    }, 2))
    .filter(bal => bal !== undefined)

With a batch size of 2, we'll start simply, letting the background task take a little longer while ensuring the more immediate, user-facing calls have priority access.

Put this in, and things are working better. It's not a perfect solution, and we still need to have the Client improved, but it gets around the two problems: flooding the database when the use-case doesn't require it, and failures on the Client when it has to handle the flood. We can fine-tune the batch size later.

UPDATE: it turned out that the library launched all the work in an initial Promise.all() so it really wasn't batching the work as I'd expected. So I wrote my own using the chunk library:

  const chunk = require('chunk')
 
  /*
   * We need a function that will batch the equivalent of:
   *
   *   const resp = await Promise.all(arr.map(itm => fcn(itm)))
   *
   * but do it in batches, so that when we get a large workload, we don't
   * overwhelm the system. This is that function. The first argument is the
   * array to process, the second is the async function, that takes one
   * argument, and the last is the batch size that defaults to a reasonable
   * value.
   */
  const asyncBatch = async (arr, fcn, batchSize = 4) => {
    const ans = []
    for (const b of chunk(arr, batchSize)) {
      const blk = await Promise.all(b.map(async itm => await fcn.apply(null, [itm])))
      ans.push(...blk)
    }
    return ans
  }

This works exactly as expected, working on n of the elements at a time, and then moving to the next batch. Much cleaner.
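A quick way to convince yourself it's really batching is to run it against a throwaway async function - this is just a sketch, with sleep() standing in for the database work:

  const sleep = (ms) => new Promise(res => setTimeout(res, ms))

  ;(async () => {
    const doubled = await asyncBatch([1, 2, 3, 4, 5, 6], async (n) => {
      await sleep(100)      // stand-in for the database work
      return n * 2
    }, 2)
    console.log(doubled)    // [ 2, 4, 6, 8, 10, 12 ] - processed two at a time
  })()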

Adding Let’s Encrypt Certs to Nginx

Thursday, October 13th, 2022


This morning I had some time and wanted to finish up the work of getting my Cloud VM running Ubuntu 22.04 working just fine as a development box - including inbound webhooks from vendors, and calls from apps like HTTPbot on my iPad Pro. The key was that I needed to be able to install and configure nginx to forward all port 443 traffic to port 6543, and that also meant getting the nginx server to be listening on port 443 with a legit certificate.

Turns out, it wasn't as bad as I thought it might be. 🙂

Starting with my Ubuntu 22.04 install, I added the packages I was going to need, based on this blog post on the nginx site.

  $ sudo apt-get -y install --no-install-recommends nginx certbot python3-certbot-nginx

Once these are installed, we could set the server_name in the nginx config:

  $ sudo vi /etc/nginx/sites-enabled/default

and update the server_name line to be:

  server_name mybox.mydomain.com;

and then we can get the initial certificate from Let's Encrypt (registering a new email account with them in the process) with:

  $ sudo certbot --nginx -d mybox.mydomain.com -d mydomain.com

The second -d argument is for an additional domain on the certificate. I didn't need it, so I just had the one -d pair on my certbot command.

After this, we edit the config file again, updating the port 443 section's location specification with:

  location / {
    # forward all HTTPS traffic to port 6543
    proxy_set_header  X-Forwarded-For $remote_addr;
    proxy_set_header  Host $http_host;
    proxy_pass        "http://127.0.0.1:6543";
  }

and then verify the nginx config with:

  $ sudo nginx -t

and then tell nginx to reload the config with:

  $ sudo nginx -s reload

At this point, the box is answering HTTPS traffic, and forwarding it on to the Node service at port 6543. Excellent. 🙂
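For reference, once certbot has made its edits, the port 443 server block in /etc/nginx/sites-enabled/default ends up looking roughly like this - a sketch, with the certificate paths being the defaults certbot writes for this hostname:

  server {
    server_name mybox.mydomain.com;

    location / {
      # forward all HTTPS traffic to port 6543
      proxy_set_header  X-Forwarded-For $remote_addr;
      proxy_set_header  Host $http_host;
      proxy_pass        "http://127.0.0.1:6543";
    }

    # the lines below are added by certbot
    listen 443 ssl;
    ssl_certificate /etc/letsencrypt/live/mybox.mydomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mybox.mydomain.com/privkey.pem;
    include /etc/letsencrypt/options-ssl-nginx.conf;
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem;
  }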

In order to refresh the Let's Encrypt Certificate on time, let's add a simple crontab entry:

  $ crontab -e

and then have the entries:

  # run all the commands on Bash not Bourne Shell
  SHELL=/bin/bash
  # send all the mail to my main account
  MAILTO=bob@mydomain.com
 
  # check the Let's Encrypt certificate each day at noon UTC
  0 12 * * *   sudo /usr/bin/certbot renew --quiet
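
It's also worth doing a dry run once, just to make sure the renewal will actually work when cron kicks it off:

  $ sudo certbot renew --dry-run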

And that should do it.

Setting up iPad Pro Development

Tuesday, October 11th, 2022


I have been doing some manual tasks that I decided today I really wanted to automate. The problem was that it meant updating a few rows in a database, and I didn't really want to expose an endpoint to do this, just on the very slim chance that the service might get hacked. I know, we protect these things carefully with Google Cloud Tasks, but I just felt these tasks needed to be done outside the scope of the running service, and it was possible, but tiring, to do it manually: first with psql, and then with a bash script which took command-line options, called psql, and made the updates.
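The script was along these lines; just a sketch with made-up table and column names, but it shows the shape of it:

  #!/bin/bash
  #
  # usage: update-row.sh <company_id> <new_status>
  #
  # The table and column names here are hypothetical, and this assumes the
  # connection string is in DATABASE_URL - the real script just wraps a
  # psql call in the same way.
  COMPANY_ID="$1"
  NEW_STATUS="$2"

  psql "$DATABASE_URL" -c \
    "UPDATE companies SET status = '${NEW_STATUS}' WHERE id = '${COMPANY_ID}';"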

In all, it wasn't a bad idea. And it was working just fine - except for the tired part. 🙂 So while I didn't want to set up a crontab job on my laptop, I could set up one on a Cloud VM that is only accessible by me based on an RSA key. That is secure, and it's out on the network, so I don't have to worry about my internet service going down.

So I needed some way to replicate my entire development environment onto my iPad Pro, and then from there use the tools I've pulled together to make this happen, and I decided to push it that last little bit.

Starting with Working Copy on the iPad, I have a complete, stand-alone Git client that has created local directories so that an editor like Textastic on the iPad can access these files and save to them, and Working Copy will detect the changes. Additionally, Textastic can upload the files to a host using scp, so I can edit and save, upload and test, and then make a PR and push up to GitHub.

I needed to be able to run Node, and a few more watching commands on the box, so I had to get everything up on the Ubuntu 22.04 box in the Cloud, and then wrangle everything to get it all going. The most troublesome thing is that mosh, the Mobile Shell that I use with Blink on my iPad, doesn't support SSH agent forwarding. It's a limitation of mosh, and I understand the reason, and I know the mosh folks are working to fix it - but it is annoying not to be able to use git within mosh because the SSH keys aren't carried up in the communication stack.

Still, with conventional ssh, they are carried, and I can use that, as needed. Someday soon, there will be a way to use something like Guardian Angel for the SSH key forwarding in mosh, but for now, this works.

At this point, I can edit locally... push to the cloud, have nodemon restart the server on any change... and hit the service on the standard Node port 6543. But then I needed to get nginx forwarding HTTPS to port 6543, and that's another challenge for another day.
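The nodemon piece is nothing special - it's just run against the entry point on the cloud box, with the filename here as a placeholder:

  $ npx nodemon server.js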

What I can do is run the bash scripts on the Cloud VM, and then crontab the runs so that I never again have to worry about being tired from running these commands at odd hours. 🙂

Upgraded Postgres to 14.4 on Homebrew

Thursday, August 11th, 2022


Nothing major, but I did notice that 14.4 was the stable release of Postgres, and so I decided it was time to upgrade to the latest from Homebrew. And it really is very easy (again):

  $ brew upgrade postgresql

and then after all the downloading and installing, we have:

  $ psql --version
  psql (PostgreSQL) 14.4

Then we are good to go! 🙂

Interestingly, this time I didn't have to run:

  $ brew services restart postgresql

for when I did, it told me it was already running. Nice - it restarted on its own. 🙂

Play.js Updated to CodeSandbox

Wednesday, May 4th, 2022


I was doing a little Node/JS coding on play.js on my iPad Pro this week, and ran into a few issues that I wrote to the developers about. They weren't all that big a deal, save one:

  • Logging is iffy - using Node/Express, their default is the debug logging package, and yet you can't see any of the log messages in the console in the app.
  • Running nodemon doesn't reload on changes - it would be nice to have a way to auto-reload changes in the files - specifically because the editor is saving them.
  • Exceptions aren't logged - if there's an uncaught exception, it's not logged/printed in the console at all. Just silence.

And it's really the last one that's the kicker... no way to see if there have been any exceptions... that makes it very hard to find errors in the code.

They wrote back that there would be updates that were coming soon that would fix most of these, and that I should sign up for the CodeSandbox Beta program and look for the updates. Well... this morning, I saw that they had an update, and changed the name of the app on iPadOS to CodeSandbox. So I fired it up to see how things had changed.

There were lots of changes, but the key problems I was having are still there, and they have really moved it away from what I liked about it, and towards another style, with different goals. It's OK... it's their app, but it's not the direction I was hoping they were going.

So... it looks like I'll have to wait a little longer to see what comes up as a development platform for the iPad...

Upgraded Postgres to 14.2 on Homebrew

Wednesday, April 20th, 2022


This morning I was looking into the Ubuntu 20.04 to 22.04 LTS upgrades, and decided that it was probably a good time to see about the latest version of Postgresql for my laptop. Thankfully, it's a minor version upgrade from 14.0 to 14.2, and this is something that Homebrew can do quite nicely:

  $ brew upgrade postgresql
  $ brew services restart postgresql

and then after all the downloading and installing, we have:

  $ psql --version
  psql (PostgreSQL) 14.2

Then we are good to go! 🙂

I haven't had a need to use the local Postgres server a lot in the last year or so, as I've been using a Google Cloud instance for a while, but it's nice to have all the client support, and to have something local, as that's still the best insurance for off-the-grid development.

Nice Config Change for Sublime Text

Thursday, February 10th, 2022


This morning I did a little searching on how to disable any Auto-Complete in Sublime Text, as there are a lot of times when I really don't want any autocomplete happening; it just gets in my way. So I was very happy to find that all I needed to do was add:

  "auto_complete": false

to the Settings file, and that will turn it off.
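In context, that just means the user settings file (Preferences > Settings) ends up with something like:

  {
    "auto_complete": false
  }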

Blissful silence. 🙂

Thankfully, Ctrl-Space will bring it back, and that's exactly the thing I was hoping to find. Then it's at my command, as opposed to simply appearing. Once again, less is more... coding is what I do, and I know what to type... so Sublime Text is again my favorite editor. Wonderful. 🙂