Creating a Demo Movie

January 17th, 2019

iMovie.jpg

This week has been a Hackathon at The Shop, and I was asked to work on a cross-domain, secure, component-based Web App Framework based on the work the kraken.js team did at PayPal. It was a steep learning curve for me and the others on the Hackathon team - none of us had any real experience with Node.js or React, and we had only this week to get something going.

The good news is that we got everything we needed running late yesterday, and today I started work on the Demo presentation, which happens tomorrow. It's all videos - each group submits one video. The only limitation is that the video has to be less than 5 min in length - and that's a hard limit, I'm told.

OK... so I was looking at all the screen capture tools on the App Store, and some of them looked pretty decent, but the good one was $250, and while I might go that high - I wanted it to be amazing for $250... like OmniGraffle good. And I saw a lot of really iffy reviews. So that didn't seem like the way to go.

Because I needed to be able to add in slides and screen grabs, I knew I needed more than the simple "start recording" that Google Hangouts does, and with nothing really obvious in the App Store or in Google searches... well... I hit up a few friends who produce stuff like this. Funny thing... they said "QuickTime Player and iMovie".

This really blew me away... I mean I knew about iMovie, but I didn't know that QuickTime Player did screen recordings - with a selectable region on the screen - and that it also did audio recordings. Again, I'm going to need to be able to do voice-overs on the slides and on things happening on the screen in the demo.

So I started recording clips. Keynote was nice in that I could make all the slides there, and export them as JPEG files, and they imported perfectly into iMovie. Then I could put them in the timeline for exactly how long I needed them, and do any transitions I needed to make it look nice.

Then I went into a little phone booth we have at The Shop, and recorded the audio with very little background noise. I could then re-record the audio clips as needed to make it all fit in the 5 min hard limit. In the end, I could export the final movie, and upload it to the Google Drive for the submissions.

Don't get me wrong... there was a steep learning curve for iMovie for about an hour. How to move, select, add things, remove things... not obvious, but with a little searching and experimenting, I got the hang of the essentials. And honestly, that's all I needed to finish this Demo video.

I was totally blown away in the end. I was able to really put something nice together with a minimum of fuss, and now that I have the essentials in-hand, it'll be far easier next time. Just amazingly powerful tools from Apple - all installed on each new Mac. If only more people knew...

When is a Free Lunch not Free?

January 15th, 2019

cow.jpg

I'm supportive of places that provide perks like free lunch and shuttle trips to mass transit - things that save employees from getting nickel-and-dimed, and aren't that hard to offer, but are nice perks. Just nice. Today at The Shop the lunch was... well... I know several folks that liked it, and thought it was really good. But it was way too cheesy for me.

I've been told that I'd never last in Wisconsin, and I have to agree... I probably would draw more than a few odd looks for my love of milk and ice cream, but not cheese. Just no thanks.

So today's free lunch wasn't free to me. But they did have the Honey Nut Cheerios, and I grabbed a fistful of those to cleanse my palate after the cheese.

Simple Immutable Data in Postgres

January 14th, 2019

PostgreSQL.jpg

A friend asked me today for the trick I've used in the past to make sure that the data in a database is immutable and versioned. This is nice because it matches how Clojure treats data internally, and it makes it easy to see who's doing what, when. Assume we start with a table - for the sake of argument, let's say it's customer, and it looks something like:

  CREATE TABLE IF NOT EXISTS customer (
    id             UUID NOT NULL,
    version        INTEGER NOT NULL,
    as_of          TIMESTAMP WITH TIME ZONE NOT NULL,
    last_name      VARCHAR,
    first_name     VARCHAR,
    PRIMARY KEY (id, version, as_of)
  );

where the first three fields are really the key ones for any table to work with this scheme. You need to have a unique key - and in this case, the easiest I've found is a UUID - and then you need a version and a timestamp for when that change was made. What's left in the table is really not important here, but it's something.

You then need to make an Audit table that has the same table structure, but has the string _audit added to the name:

  CREATE TABLE IF NOT EXISTS customer_audit (LIKE customer INCLUDING ALL);

We then need to create the following trigger, which intercepts the INSERT and UPDATE commands on the customer table and places the historical data into the audit table, so that the most recent version of the customer data is always kept in the customer table.

The trigger looks like this:

  --
  -- Create the trigger on INSERT and UPDATE to update the version if it's
  -- not provided, and to maintain all versions in the audit table, but have
  -- the current version in the non-audit table. Importantly, NOTHING is
  -- deleted.
  --
  CREATE OR REPLACE FUNCTION audit_customer()
  RETURNS TRIGGER AS $body$
  DECLARE
    ver INTEGER;
  BEGIN
    -- get the advisory lock on this id
    PERFORM pg_advisory_xact_lock(('x' || translate(LEFT(NEW.id::text, 18),
                                  '-', ''))::bit(64)::BIGINT);

    -- get the max of the existing versions for this id
    SELECT MAX(version) INTO ver
      FROM customer_audit
     WHERE id = NEW.id;
    -- and bump it up one and use that
    IF ver IS NULL THEN
      NEW.version := 1;
    ELSE
      NEW.version := ver + 1;
    END IF;

    IF TG_OP = 'UPDATE' THEN
      -- insert the new version into the audit table; the row in the
      -- customer table is updated in place
      INSERT INTO customer_audit
        VALUES (NEW.*);
    ELSIF TG_OP = 'INSERT' THEN
      -- insert the new version into the audit table
      INSERT INTO customer_audit
        VALUES (NEW.*);
      -- and delete any older versions from the customer table
      DELETE FROM customer
        WHERE id = NEW.id
          AND version <= ver;
    END IF;

    -- finally, return the row to be inserted/updated in customer
    RETURN NEW;
  END;
  $body$ LANGUAGE plpgsql;

  CREATE TRIGGER set_version BEFORE INSERT OR UPDATE ON customer
    FOR EACH ROW EXECUTE PROCEDURE audit_customer();

At this point, we can INSERT or UPDATE on customer, and the previous versions of that customer will be kept in the audit table, while the most recent version is held in the customer table.
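To make the behavior concrete, here's a minimal session sketch - the UUID and names are just made-up example values, and this assumes the table and trigger above are in place:

    -- first write: the trigger sets version = 1, and the row lands in
    -- both customer and customer_audit
    INSERT INTO customer (id, version, as_of, last_name, first_name)
      VALUES ('a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11', 1, now(),
              'Smith', 'Jane');

    -- second write: the trigger bumps version to 2; customer now holds
    -- only version 2, while customer_audit holds versions 1 AND 2
    UPDATE customer
       SET first_name = 'Janet', as_of = now()
     WHERE id = 'a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11';

    -- the full history, oldest first
    SELECT version, as_of, first_name
      FROM customer_audit
     WHERE id = 'a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11'
     ORDER BY version;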

I have found this very useful, and I've put it in a gist for easy access.

The point of:

    -- get the advisory lock on this id
    PERFORM pg_advisory_xact_lock(('x' || translate(LEFT(NEW.id::text, 18),
                                  '-', ''))::bit(64)::BIGINT);

is to get an exclusive lock on the data for a given id. This is necessary to make sure that updates from multiple services get serialized on the same data. This scheme can't merge concurrent changes - it can only guarantee a strict sequence of changes to the table, with each one entirely correct for the time it was entered.
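For illustration, here is how that expression reduces a UUID to a single 64-bit lock key - the UUID here is just an example value:

    -- the first 18 characters of the UUID text are 16 hex digits plus
    -- two hyphens; stripping the hyphens leaves exactly 16 hex digits,
    -- which is the 64 bits that bit(64) needs
    SELECT ('x' || translate(LEFT('a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11', 18),
            '-', ''))::bit(64)::BIGINT;

Every caller that computes the key from the same id gets the same BIGINT, so they all contend for the same transaction-scoped advisory lock.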

So... what happens if you have a string as the primary key, and not a UUID? Well, you can use the MD5 checksum of the string as the lock indicator:

    -- get the advisory lock on a general string
    PERFORM pg_advisory_xact_lock(('x' || md5(NEW.wiggle::VARCHAR))::bit(64)::BIGINT);

where the field wiggle is a VARCHAR, and here we are just computing the MD5 and using that as the basis of the lock. Yes, there could be some hash collisions, but that's likely not a huge performance problem, and it's conservative in that we'll over-lock, not under-lock.

UPDATE: a friend asked about using an int as the primary key, and in that case, the line would be:

    -- get the advisory lock on an integer id
    PERFORM pg_advisory_xact_lock(NEW.id::BIGINT);

where the column id is an int. Again, we just need to cast it to a BIGINT for the advisory lock call. After that, Postgres does the rest.

Happy Birthday to Me!

December 31st, 2018

Cake.jpg

It's the big 57 this year, and I honestly don't feel it, but then again, not many probably do. I am glad to have had the last week off work - The Shop has shut down for the week between Christmas and New Year's, and I got lucky with the draw.

I have been really enjoying all the Advent of Code puzzles from past years, and that has really made the time pass nicely. Just keep living in this moment.

Another Christmas in the Books

December 25th, 2018

Christmas Tree

It's been a very quiet Christmas this year - and while I don't necessarily think that's the very best way to spend the holiday, it beats some of the alternatives, and that's what I have to remember... some of the alternatives.

There may come a day when I feel differently, but today, I'm happy enough that it's been a quiet day.

Having Fun with Advent of Code

December 11th, 2018

Christmas Tree

It's Day 11 of Advent of Code, and I have to say that I'm having a great time this year. It helps that they're still working on the rearrangement of the groups with the acquisition, so I've got a little spare time most days.

I honestly expected to be using Swift, Clojure, ObjC - all mixed in there. That was the advantage of getting CodeRunner a little bit ago - that I could mix-n-match languages based on the needs of each problem. But I have to say that I'm enjoying using Clojure 100% so far this year. It's just so good at what I need it to do... it's hard to find a need for another language.

I've also really enjoyed the times when the solution for the first part of the day is fine with my original implementation, but then the second part requires that I really re-think the solution and come up with a much more performant solution. Not always easy, but when I get something that really changes the scope of what can be done, it's a lot of fun.

One of the guys at The Shop pointed out that I'm doing the problems like Programming Golf - where the minimal solution is the best. And that's exactly what I enjoy about these - it's about a minimalist approach to the problems. What fun. 🙂

AWS Adds ARM Instances to EC2

November 28th, 2018

Amazon EC2 Hosting

I was surprised to read that at its yearly conference, Amazon announced that you can now spin up EC2 instances based on their custom ARM CPU. This isn't a complete surprise - face it, Apple is close to launching ARM-based laptops and desktops. It's been batted about in the press for a while, and based on the old quad-fat binaries, the technology is there, and Apple certainly has all the experience to get macOS up and running on ARM.

These aren't necessarily the cheapest EC2 instances - the a1.medium, a 1 CPU, 2 GiB RAM instance, is $0.0255/hr, which rolls up to $223.38/yr for the instance, and the t3.nano starts at $0.0052/hr. What's most interesting is that AWS did the math and decided that building their own CPU - and then, of course, their own machines - was the cost-effective way to go. Amazing.

I have to believe that Intel is missing out here - or maybe they'll stay tied to the x86 chipset and ride that for all it's worth. Who knows... And how long can it be before we see laptops and desktops based on ARM? Not long.

SubEthaEdit 5 is Open Source

November 28th, 2018

subethaedit.jpg

This morning I saw a tweet from SubEthaEdit that they were Open Sourcing the editor - and that the current version, SubEthaEdit 5, was still on the Mac App Store, and would be free. This was a real surprise to me. I've paid for several versions of this collaborative editor on the Mac - heck, I've written syntax highlighting definition files for Make and Fortran for this editor. It's been a big part of my toolset in the past.

I have worked with my good friend on Macs for many years, and when this first appeared, as Hydra, I thought it would be a great tool for working on code with him. But it was commercial, we were in different states, and we hadn't even started using Git - GitHub wasn't even an idea at the time. So it just fizzled out.

But several times in the last 5 years we've both talked about getting something like this going for remote pair coding. It's just an editor, and he's now using Cursive for his Clojure coding, so again, maybe it's not such a great fit... and there are other services going for an add-in mode for existing editors, so maybe it needs to be updated to really find its market. If so, I think that would be great.

I hope it finds a great group of developers now that it's Open Source. I'd love to have a good tool that's really written to handle the collaborative editing from the jump. Then again, I'm not all that sure what we'd need above GitHub... but it's an admirable goal.

Paw is a Great REST API Tool

November 28th, 2018

Paw

This morning I noticed that Paw 3.1.8 was released, so I updated right away - it's about the best tool I've ever used for testing and exercising REST APIs on any platform, and on the Mac, it's just gorgeous. This is a tool that I used constantly for years when working on Clojure REST services. It allowed me to have variables for each call, and then to group them into environments, so that it was easy to switch from development to local to production and see the different responses - where the variables would include the name of the host, etc.

Paw 3 1

Postman is nice - and it's got a lot of features - but it isn't a native Mac app, and it's tied to the UI and workflow of a web app. Which is fine - I've used Postman a lot - but once I started using, and configuring, Paw, it wasn't even close. This is how Mac apps - tools - should be written, and sure, it's not cheap, but good things rarely are.

I still smile when I pull up the sets of calls and remember how easy it was to load up a problem request, fire it off, document what was happening, and then see it in the logs... well... this was one of the tools that really made that job a dream.

Another Thanksgiving in the Books

November 23rd, 2018

Thanksgiving

Well... it was a lot of driving, but I'm home, and enjoying the long weekend where I can catch up with work around the house, and rest for the week ahead. It was nice to see my family - all my siblings were there - save my oldest sister, and it was good food... I brought the pies, as usual... but it's also nice to be home.

I find that I'm reaching a point in life that I get about all the company I need with these few trips home at the holidays. I have lots of company at work, and on the train, so it's not like I don't see people. I just find that I can be cheerful and pleasant, but I don't have to have deep conversations at this point in my life.

I'm making the best of the path that I'm on.