Day 1 at the New Shop

December 1st, 2020

Bob the Builder

Today is the first day at the New Shop, and I'm a bit nervous that it's all going to be Node and React - tools I haven't done a lot of work in. But thanks to some help from a good friend, I feel I have a good start, and the Pragmatic Programmers' Simplifying JavaScript really is a good book for getting up to speed on the latest changes to the language.

There's going to be a lot of learning, and it's going to be a little stressful at times as I try to come up to speed as quickly as possible... but I'll be working with some very fine people, and this is the path I'm on... I need to learn all that I can - regardless of the circumstances.

I'm reminded of the chant: The King is dead. Long live the King! Life is a lot like that, it seems... and off we go! 🙂

Setting Up a Versioned Node Environment

November 25th, 2020

JavaScript

Today I spent a little time with a good friend who helped me get going on a good, versioned Node environment - a lot like RVM for Ruby, but for Node. I wanted to do this because it looks like I might be doing some work for a company where all the development is based in Node, and I wanted to make sure I got it all set up right.

I just finished reading a nice book on ES5, ES6, Promises, async and await, and all the new features of JavaScript: Simplifying JavaScript from the Pragmatic Programmers. They are having a Thanksgiving Sale, and it seemed like a great time to pick up a book on the subject that I'd probably like. I did.

It's been a long time since I spent any real time on JavaScript, and if I'm going to be taking a bite out of this project, I wanted to make sure I had come up to speed on JavaScript, and Node as well. The book was laid out well, with all the ideas building on a decent understanding of JavaScript, but not assuming the latest additions. It read well to me, and I was able to finish it in two days.

So, here's what I needed to do on my 16" MacBook Pro to get things up and running... 🙂

Start off by installing nodenv from Homebrew. This is the equivalent of rvm, and will manage the Node versions nicely for me.

  $ brew install nodenv

I then needed to add the environment set-up to my ~/.zlogin file:

  # now do the nodenv stuff
  eval "$(nodenv init -)"

right after I set up my PATH and RVM environment things. It's very similar to the RVM approach, with directory-level controls, as well as system-wide defaults.
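The directory-level control works just like the per-project version files in the Ruby world: pin a version for one project, and nodenv picks it up automatically there. If I'm reading the nodenv docs right, it's just this (the project name is obviously a placeholder):

  $ cd my-project
  $ nodenv local 14.15.1

which writes a .node-version file into the directory, and that version wins whenever you're in there.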

At that point, I can source my ~/.zlogin, and I'm ready to go. Next is to install a good, long-term support (LTS) version of Node:

  $ nodenv install 14.15.1
  $ nodenv global 14.15.1

where the second command sets that version as the global default for new projects, etc. You can always check the versions with:

  $ nodenv versions
  * 14.15.1 (set by /Users/drbob/.nodenv/version)

Next was to install a few global tools with npm that I'd need:

  $ npm install -g express-generator
  $ npm install -g nodemon
  $ nodenv rehash

where the first generates the basic RESTful skeleton for a service, and the second is a way to run a Node app while monitoring the filesystem for changes to the files, reloading them automatically. That will no doubt prove exceptionally handy. The rehash command is something my friend has found to be necessary when installing new global tools, as they don't seem to get picked up in the PATH properly without it. Fair enough.

At this point, we can make a new project, just to play with the new features in the book. Start by making a directory to put all this in, and then use the express-generator to make the skeleton we need:

  $ mkdir playground
  $ cd playground
  $ express api --no-view

and now, in the api/ directory we have what we need to get started. Simply have npm pull everything down:

  $ cd api
  $ npm install

and we are ready to go.

There is an index.html file in the public/ directory, and we can use that... and running the Node server is as simple as:

  $ node bin/www
  ... this is the log output... 

or if we want to use the file-watching version, we can say:

  $ nodemon bin/www
  ... this is the log output... 

The port is set in the bin/www script, but I'm guessing the default is port 3000, so if you go to localhost:3000 you'll see the GET calls, and the page. Very slick... very easy. 🙂
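Peeking at the generated bin/www seems to confirm it - the port appears to come from the PORT environment variable, with 3000 as the fallback - so, assuming that holds, you can move it without editing the script at all:

  $ PORT=8080 node bin/www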

Once I get this into a git repo - or start working on a real project/git repo - I'll see if I can get it loaded up on my iPad Pro using play.js, as it appears to be able to run all this and serve a web page to hit it... that would be very interesting to work with, given the power of the iPad Pro, and the perfect size.

UPDATE: Indeed... once I pushed the code to GitHub, I went into play.js on my iPad Pro, created a new project from a Git clone, and after putting in the repo location, the SSH keys, etc., it all came down. Then it was just resolving the dependencies in the UI, setting the "start" command to be the npm command from the package.json, and it ran.

Open up the play.js web browser, and it's there - on port 3000, just like it's supposed to be. And editing the file, refreshing the page - it's there. No saving - it's just there. Amazing. This is something I could get used to.

Updating Postgres to 13.1 with Homebrew

November 23rd, 2020

PostgreSQL

With the update to macOS 11 Big Sur, and the updates I keep doing to my Linode box, I thought it would be a nice time to update Postgres to the latest that Homebrew had. I was expecting 12.5, as that's the latest for Ubuntu 20.04 LTS at Linode, and it was surprisingly easy - even with the major version update.

The standard upgrade for anything in Homebrew is:

  $ brew upgrade postgres

and it will upgrade the binaries - which is all that's needed if it's a minor release change. But if it's a major release change - like it was for me, from 12.x to 13.x - then you also have to run:

  $ brew postgresql-upgrade-database

and that will update the database files, placing the old ones in /usr/local/var/postgres.old - so after you're sure everything is running OK, you just need to:

  $ rm -rf /usr/local/var/postgres.old

and it's all cleaned up.
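Before blowing away the old directory, it doesn't hurt to double-check that the upgrade really took - psql can report both the client and the running server versions (the second command assumes your user database is in place):

  $ psql --version
  psql (PostgreSQL) 13.1
  $ psql -c 'SELECT version();'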

The one wrinkle is that I have set the environment variable that stops Homebrew from doing automatic cleanups of old versions of packages - because I wanted to have multiple versions of Leiningen hanging around - so I needed to clean up the old versions of postgres with:

  $ brew cleanup postgres
  Removing: /usr/local/Cellar/postgresql/11.1_1... (3,548 files, 40.3MB)
  Removing: /usr/local/Cellar/postgresql/12.1... (3,217 files, 37.7MB)  

and then the old versions are cleaned up as well.
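For anyone wanting the same set-up, the variable that turns off Homebrew's automatic cleanup is HOMEBREW_NO_INSTALL_CLEANUP - assuming I'm remembering the right one, it's a single line in the shell start-up file:

  export HOMEBREW_NO_INSTALL_CLEANUP=1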

I was expecting 12.5... but got all the way to 13.1 - nice. 🙂

It’s Time for Christmas Music

November 16th, 2020

Christmas Tree

This morning it's time to hit the Christmas Playlist on my iPhone, as it's just not going to be a normal Thanksgiving, and so there's no reason not to feel a little of the warmth and happiness of Christmas time while we're still in lockdown.

This morning, I see the US has topped 11 million cases, and the lagging effects of the days with 150,000-plus new cases have yet to kick in. It's going to be a hard winter, and I'm lucky - I can do what I need to do without risking my life, or the lives of others. But many aren't so lucky, and for them this will be a continuation of the harsh times all year.

I guess it was just time to hold onto something that has meant so much to me for so many years. Nothing wrong with that. 🙂

Getting Apache 2.4.46 + PHP 7.3.22 Going on macOS 11 Big Sur

November 13th, 2020

Yosemite

This morning, with the update to macOS 11 Big Sur, it was time to perform the ritual of getting the old web development tools I've used in the past going again - this time on macOS 11. Now, I haven't used PHP in ages, and this looks to be the last time I'll have to worry about it, as Apple is going to drop PHP from their releases, and they have already dropped Postgres support from their PHP build. But let's get done what we can.

Activating UserDir in Apache 2.4.46

As in the previous updates, the UserDir extension is not enabled by default, so we need to get that going right away. This enables code to be run from the per-user development directories, and that's a big time-saver. First, we need to enable the UserDir module in Apache, and then make a specific config file for the user in question. Start by editing /etc/apache2/httpd.conf - line 184 needs to be uncommented to read:

  LoadModule userdir_module libexec/apache2/mod_userdir.so

and then similarly on line 521 uncomment the line to read:

  Include /private/etc/apache2/extra/httpd-userdir.conf

Next, make sure that the file we just included is set up right for including the user directories. Edit /etc/apache2/extra/httpd-userdir.conf - line 16 needs to be uncommented to read:

  Include /private/etc/apache2/users/*.conf

At this point, you need to make sure you have at least one file in the /etc/apache2/users/ directory - one for each user, like drbob.conf:

  <Directory "/Users/drbob/Sites/">
      Options FollowSymLinks Indexes MultiViews ExecCGI
      Require all granted
  </Directory>

where the last line - Require all granted - is new as of Apache 2.4, and without it you will get errors like:

  [Thu Dec 18 10:41:32.385093 2014] [authz_core:error] [pid 55994]
    [client fe80::7a31:c1ff:fed2:ca2c:58108] AH01630: client denied by server
    configuration: /Users/drbob/Sites/info.php

Activating PHP in Apache

The next thing to do is to activate PHP in the Apache 2 supplied with macOS 11. This is line 187 in the same file - /etc/apache2/httpd.conf - and you need to uncomment it to read:

  LoadModule php7_module libexec/apache2/libphp7.so

and then verify a file called /etc/apache2/other/php7.conf exists and contains:

  <IfModule php7_module>
    AddType application/x-httpd-php .php
    AddType application/x-httpd-php-source .phps
 
    <IfModule dir_module>
        DirectoryIndex index.html index.php
    </IfModule>
  </IfModule>

which does all the other PHP configuration in a separate file to make upgrades easy.
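One habit worth keeping: before restarting, have Apache check the config files for syntax errors, since a typo in any of the files above will keep it from starting:

  $ sudo apachectl configtest
  Syntax OK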

Finishing Up

At this point, a simple restart of apache:

  $ sudo apachectl restart

and everything should be in order. Hit a URL for a PHP file with the contents:

  <?php
    phpinfo();
  ?>

and you should see all the details about the PHP install. It's all there, as in older releases, but it was surprising to me to see that there is no longer any support for Postgres in the PHP that ships with Big Sur. More to the point, the warning is clear: PHP will be dropped in a future macOS release. Until then, we still have:

  • PHP 7.3.22
  • MySQL 5.0.12
  • SQLite3 3.28.0

so things are still there - kinda... MySQL is still supported, for those that want that, and SQLite3, which is likely my most logical choice. But in truth... this is progress. Folks don't do PHP development like they used to, and so it's going to go away. I'll miss it, but maybe Homebrew will have something - and I remember building it all from source before... so I can do it again, if I need to.

Upgraded to AmpliFi Alien

November 13th, 2020

Networked World

A few days ago, I was running some speed tests on my iPhone 12 Pro, and noticed that the WiFi speed when connected to my Apple Time Capsule and AirPort Extreme was about half that of connecting directly to the Xfinity xFi gateway. Since I wanted a little more security and cohesive networking, and I don't want to put everything on the Xfinity box, it was time to upgrade my WiFi.

I've been looking at the AmpliFi Alien for quite a while, but haven't had a great reason to change - given that my Time Capsule was also holding my Time Machine backups. So first I had to move to Backblaze for backups, and that turned out to be a great move for me.

I wanted to have a place where all the versions of all my files would be stored, and with the "Forever" option at Backblaze, I get just that. It's a little more per month, but it's exactly what I wanted: one place for every version of every file on this, my main machine. It's just wonderful.

With the iOS app, I now have access to these files - and the peace of mind that I'll be able to look back in time for those things I might have been foolish enough to delete. I honestly don't expect to have a major data loss, but that's just when things like that happen. 🙂

With my backup issue solved, the Alien mesh arrived and it was time to install it. First, it's a beautiful piece of tech - the display is amazing, and the iOS app is impressive in what it can do and measure - all the goodies that I'm sure a current Apple router would have, if they made them. But alas, they don't. Yet as easy as it was to set things up, I ran into a problem with my VPN to The Shop, and that was a real pickle.

Removing the DNS Cache on AmpliFi Alien

Everything was working great - the speed tests done at the router were showing me exactly the speeds I was expecting from my Xfinity Gigabit service - a bit too asymmetrical for me, but I'm working on that, and hope that Gigabit Pro or AT&T Fiber will be available with more symmetrical numbers, and maybe more speed. But that's another story.

The mesh was easy to set up... including upgrading the cylinders to the latest firmware. Almost like the Sonos set-up and control... very simple, very clear. Nice. I had to make sure all my machines had the new access point in their lists, and all were talking and happy... one interesting point - I had to reboot my Apple TV 4K because it was holding on to the old (wired) DHCP address, and it wouldn't refresh normally. No big deal.

But the real issue was with the OpenVPN client for The Shop. Everything seemed fine with accessing almost all services, but DNS for the shop.com domain for work wasn't being resolved. Wow... OK... let's dig into this. Turns out the Alien router caches DNS so that it can offer you the control address of http://amplifi.lan/ from your web browser.

That's a nice touch, but it means the DNS settings pushed by the VPN didn't take... well... it was simple enough to change:

  • Go to http://amplifi.lan/ and log in with the password you just set up - this is pretty easy, and while it's not obvious, a simple Google search pulled it up.
  • Check Bypass DNS Cache in the list and save - the caching is really not a bad idea in today's DNS-hijacking environment, but it has to be a little smarter about the existence of VPNs in the world.
  • Shut down all networking - disconnect from the VPN, turn off WiFi, unplug the wired network - make sure the machine gets to a clean state.
  • Plug in the network, turn on WiFi, connect to the VPN - in that logical order, start the networking back up so that things are rolling again.
  • Edit /etc/hosts to add amplifi.lan - this just gets us back to the state where we can go to http://amplifi.lan/ for the control of the router. It's as simple as adding a line to /etc/hosts using the Gateway (base router) address from the DHCP info on any of the local machines:
      192.168.153.1   amplifi.lan amplifi
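With that done, a quick check from the command line proves out the fix - resolve any host in the work domain while on the VPN (this hostname, and the answer, are made up):

  $ dig +short build.shop.com
  10.1.23.45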

At this point, it's all working as it should. The router is safe and secure, and very fast. It has great diagnostics built in, available from the iOS app... and it's silent - no spinning drives like the Time Capsule.

There may come a time that I don't need to worry about the VPN issues, or maybe they will update the firmware to more intelligently cache DNS data... that would be nice... but until then, this is exactly what I'd hoped. 🙂

Update on iPad Development

November 2nd, 2020

Linode

This morning I spent a little time happily updating my Linode box with the updates to Ubuntu 20.04, and wanted to write down what I'd found while investigating the "held back" updates that Ubuntu does. First, the problem. 🙂

In doing the standard:

  $ sudo apt-get update
  $ sudo apt-get upgrade

I saw that all upgrades were applied, and after a restart (what a wonderful console Linode has), all was back... kinda. I could see that there were an additional 8 packages that needed to be upgraded, and yet the standard commands didn't seem to pick them up.
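For what it's worth, apt does call these out - the tail end of the apt-get upgrade output shows something like (the actual package list elided here):

  The following packages have been kept back:
    ... the 8 packages listed here ...
  0 upgraded, 0 newly installed, 0 to remove and 8 not upgraded.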

So I did a little searching, and it turns out that these packages couldn't be upgraded without installing additional packages. That's why they were being held back. Makes perfect sense to me, and thankfully, the way to fix this is very easy:

  $ sudo apt-get upgrade --with-new-pkgs

or, as I have read, you can say:

  $ sudo apt-get dist-upgrade

to do all the distribution upgrades - which will include adding packages, as needed.

And then, to clean up the old packages:

  $ sudo apt autoremove

And after a reboot, the system was completely up-to-date. Moving forward, I'll use dist-upgrade, as it's clearly the preferred mechanism. I usually do this on the weekend, just to make sure it's all updated on a regular basis.

At the same time, it has really been quite fun to learn the extent to which tmux, Blink Shell, and Textastic on my iPad can be exactly what I wanted from an iPad development platform.

One of the biggest surprises was that when Blink Shell is updated from the App Store - it maintains the connections! I was completely blown away... I expected to have to fire up the connections to the host again - but nope... the display was in the same state as it was before the update, and it worked perfectly. This is really the "Mobile Shell", and Blink Shell is an amazing implementation of it on the iPad.

The next surprise was that Textastic can be pointed at the checked-out GitHub repo on the remote host (no surprise there), and it remembers the location of each source file, so it's one key to upload it back to the remote host (I already wrote about this). But this means I would have to hop onto the remote host to commit the changes... except with Working Copy, I can simply split-screen Textastic and Working Copy, drag the changed files from Textastic to Working Copy, and commit them there.

Why does this matter? Well... as of the current version, Blink Shell does not yet do SSH Key Forwarding, so I can't easily use my SSH key authentication into GitHub through it. Yes, they know about this, and they say it's coming in v14, but as of today I would have to use something like Prompt from Panic, which does enable SSH Key Forwarding. With Working Copy on my iPad, I don't have to do that... I can easily see the diffs and make the commits, all from a nice UI on the iPad. Very nice. 🙂
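For reference, this is all that agent forwarding amounts to in a desktop OpenSSH config - the bit Blink doesn't honor yet (the host alias and address here are just placeholders):

  Host linode
      HostName 123.45.67.88
      ForwardAgent yes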

Don't get me wrong... I'll be very excited when Blink Shell gets SSH Key Forwarding, but until it does, I'm OK... and this is just an amazingly nice platform for the development work I really like to do. What a joy!

Another Approach for iPad Development

October 15th, 2020

Building Great Code

While apps like Buffer Editor are a good all-in-one tool, the journey I started on has yielded some truly remarkable iPad apps for doing the same things - just not all-in-one. More of a collection of tools, and they work together quite nicely.

The first is the editor - Textastic for the iPad. This is a great editor that can handle the SSH/SCP downloading of working directories from my Linode host, but the real plus is that the downloads remember where they came from, so with the SSH key it's a single keystroke to upload the changes back to the remote host. This lets me edit locally, with auto-save on my iPad, and then - one keystroke - the file is up on the host, ready to be used there. Fantastic workflow.

At the same time, it integrates with Working Copy, a nice Git tool for the iPad that downloads from GitHub, GitLab, BitBucket, etc. and maintains local copies of the repos, so that you can really work on the iPad as a secondary machine. Sure, you can't compile on it, but you can use Textastic as a nice editor (when the built-in one isn't enough), and then use the Textastic upload, or the Working Copy commits, to get the changes to the correct place. Very slick.

But without a doubt, the very best of the tools I found was Blink Shell for the iPad. This implements the mosh protocol - Mo(bile) Sh(ell) - and it's a fascinating read. It gives me the appearance of an "always on" SSH connection to the remote host, and all I need to do there is install the mosh server. It's simply:

  $ sudo apt-get install mosh

on my Ubuntu 20.04 box at Linode, and then I just configure the connection parameters in Blink Shell to connect with mosh, and I'm good to go. I can quit the app, I can sleep my iPad and wake it back up, and when I start the app, it's there... just as I left it. Instantly back at the same point in the REPL, and tailing a log file (which I use tmux to set up). It's an amazing tool, and one that I'm stunned I didn't know about earlier... but in truth, I wouldn't have needed it until the iPad.
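One caveat, if there's a firewall on the box: mosh runs over UDP - by default picking a port in the 60000-61000 range - so those ports need to be open. With ufw, that would be something like:

  $ sudo ufw allow 60000:61000/udp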

What I am left with is similar to what Buffer Editor does - but it's decidedly faster to get moving, and the tools are really quite amazing in their own right. Working Copy is a more than adequate Git client, with previews for standard files, and all the configuration I would need. Using the GUI for commits, as opposed to my usual command-line, is nice, and the fact that it connects to GitHub to see what repos I have means I don't need to copy a bunch of URLs to clone them.

Textastic has been in the App Store for 10 years, and its remembering where the files came from, with one-keystroke upload, is so clever... it's honestly a feature I hadn't even imagined - but it's exactly what I was looking for. A true delight to use. And the integration with Working Copy is very nice, so I get the best of both.

Blink Shell with mosh and tmux are really the winners, though... the panes allow me to have a REPL in the top three-quarters and a tailed log file in the bottom quarter, without ever having to worry about having enough space on the screen. The speed of returning to development after an hour, or a day, is just amazing. These tools have made the value of Linode servers jump up considerably in my mind. This set-up would allow me to work on several projects, each on a small node, all able to talk to one another - with Postgres on each node. It's really quite amazing. 🙂
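For the curious, that layout is just a couple of tmux commands - the split size and the log path here are only examples, and older tmux versions want -p 25 instead of -l '25%':

  $ tmux new-session -s dev
  ... then, inside the session, split off the bottom quarter for the log:
  $ tmux split-window -v -l '25%'
  $ tail -f logs/service.log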

Now I just need some time to work on these projects. Fear not, Advent of Code will be here sooner than you think!

Setting Up Linode and Buffer Editor

October 12th, 2020

Linode

It's been fun to get access to the beta of GitHub's Codespaces, but one of its shortfalls is that when you run an outward-facing service - like a Jetty server - the IDE understands that the port needs forwarding, but on the iPad, in Safari, there's really no way to forward a port. Additionally, the IP address of the container on the back-end isn't opened up to answer those forwarded requests. So while it's a great idea for development on an iPad, you can't really do any RESTful service development with it.

Additionally, the nice thing about it being all browser-based is also a limitation: there's no local storage. That means there is no offline mode for editing - and while we can't (yet) compile on the iPad, you can edit, if the files are local... and without local storage, you don't have that.

So I went looking and found a very close match to what I might write: Buffer Editor. It's on iOS and iPadOS, and it allows for local and remote editing from an incredible number of sources - Dropbox, iCloud, GitHub, BitBucket, etc. For example, you can clone a GitHub repo right on your iPad, and then edit it locally, and push up changes. You can also set up an SSH/SFTP connection and remote edit the files, and even have a terminal session.

This is a lot like Panic's Code Editor for iOS, but Buffer Editor handles general Git access, and Code Editor does not. Also, Buffer Editor handles Clojure syntax, and Code Editor doesn't.

I was able to write to the Buffer Editor folks and give them updated syntax rules for Clojure, and within a week they had an update out with the changes in it. That's some impressive support. I've done the same with Panic, but I haven't heard back yet - and since I know they know Git support is important, I'm thinking they may not really be supporting Code Editor on iOS as much... that would be a shame.

Still, Buffer Editor is working great - but I needed to have a host on the back-end to be able to do the work. I wasn't a huge fan of AWS, so I decided to try Linode, and I'm so very happy that I did! 🙂

Linode is a lot like AWS - with a somewhat limited feature set. You can get machines - either shared CPUs or dedicated ones - and you can pick from a lot of different styles: compute, big memory, GPUs, etc. It's all most folks would need for most projects. They also have lots of SSD disk space - like NFS - and an S3-like object storage. Add in load balancers, and it's enough to do most of the things you need - if you roll your own services like databases, etc.

They also had a nice introductory offer so I decided to take it for a spin, and see what Buffer Editor could do with a nice Ubuntu 20.04 back-end with AdoptOpenJDK 11, and Clojure.

The basic instructions are very clear, and it was easy enough to set up the root account and get the box running - it was even a little faster than AWS. Once it was up, I logged in and updated everything:

  (macbook) $ ssh root@123.45.67.88
  (linode) $ apt-get update && apt-get upgrade

With all that OK, I then set the hostname, updated the /etc/hosts file, made my user account, and got ready for moving my SSH key:

  (linode) $ vi /etc/hosts
             ... adding: 123.45.67.88   opus.myhome.com
  (linode) $ adduser drbob
  (linode) $ adduser drbob sudo

and then as me:

  (macbook) $ ssh drbob@123.45.67.88
  (linode) $ mkdir -p ~/.ssh && chmod -R 700 ~/.ssh

and then, back on my laptop, I sent the public key over:

  (macbook) $ scp ~/.ssh/id_rsa.pub drbob@123.45.67.88:~/.ssh/authorized_keys

Then I can adjust the /etc/sudoers file if needed - the adduser command above already put my new user in the sudo group, but there could be tweaks you might want to make there.

At this point, sudo was working on my account on the Linode box, and then it was time to lock down the machine a little:

  (linode) $ vi /etc/ssh/sshd_config
             ... PermitRootLogin no
                 PasswordAuthentication no
                 AddressFamily inet
  (linode) $ service ssh restart
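One tip worth heeding before closing the root session: make sure a fresh login with the key actually works, and that sudo is happy - otherwise you can lock yourself out of the box:

  (macbook) $ ssh drbob@123.45.67.88
  (linode) $ sudo whoami
  root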

At this point, I could install the other packages I needed:

  (linode) $ apt-get -y install --no-install-recommends openjdk-11-jdk leiningen grc mosh
  (linode) $ apt-get -y install --no-install-recommends postgresql postgresql-contrib

Then I can make a new user for Postgres with:

  (linode) $ sudo -u postgres createuser --superuser drbob

and then I can clone all the repos I need. The box is ready to go.

With this, I can now edit offline on my iPad, then push or copy the files when I get to a network connection, and edit and debug as much as I'd like when I do have a connection. It's almost exactly what I was hoping for.

The one missing thing: panes for the terminals... I'd like to have a REPL and a tailed log file visible at the same time. I think I can accomplish that with the screen command, but I'll have to experiment with it a lot more to find out. But it's close... very close... 🙂
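From what I can tell of the screen docs, the split is all key bindings off the Ctrl-a prefix - roughly:

  Ctrl-a S      split the current window horizontally
  Ctrl-a Tab    move the focus to the new (empty) region
  Ctrl-a c      start a shell there - say, for a tail of the log

but I'll believe it when I see it working with the REPL.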

Looking for clj/deps REPL Tools

September 29th, 2020

Clojure

This morning I did a little looking for any tools in the clj/deps system like the Leiningen plugins that do color syntax highlighting and data formatting - something I find very useful when the result of a function is a complicated data structure, or a sequence. I have been using whidbey, as it works just fine with JDK 11 and any Clojure past 1.8.

The issue is that a good friend switched to the clj/deps package management and build tool for Clojure, and likes it quite a lot. So I've been trying to build up a way to do the same kinds of things in that ecosystem as with Leiningen. I didn't have a lot of luck the last time I looked, but maybe something was out there now? Or maybe I'd get a little lucky with the words I used in the search?

I found something that was close: rebel-readline. This is nice in that it's a simple readline replacement for the REPL, so it doesn't need a special environment - or a different app. I liked that it was able to do a lot of things on the read line that whidbey couldn't - because whidbey isn't really active on the "read" part of the REPL, but on the "print" part. This was nice, and I really liked that it was capable of showing the Clojure docs for a function - and it was fast, so that's a plus... but almost by definition, it wouldn't do anything for the output. So that wasn't as successful as I'd hoped.
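For anyone wanting to kick the tires, the rebel-readline README shows a one-liner to start it with clj - no project set-up needed (the version will no doubt have moved on by the time you read this):

  $ clj -Sdeps '{:deps {com.bhauman/rebel-readline {:mvn/version "0.1.4"}}}' \
        -m rebel-readline.main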

Then a friend pointed out Reveal - and this was a lot more than I was looking for: the graphs, the processing of the output... This is a lot more like Gorilla REPL, which I've used, and written about in the past. Gorilla REPL is all done in a browser, while Reveal runs from a terminal session - so it's not exactly the same - but it's certainly more of a joint "formatter and visualization tool" than just syntax highlighting and formatting of the REPL output.

I'll keep the links around, and maybe someone will write something like rebel-readline - but for the print (output) loop for the REPL... that would be nice. Until then, we carry on...