Archive for December, 2016

Happy Birthday to Me

Saturday, December 31st, 2016

Well... today is my birthday, and as a little (unexpected) present to myself I got a new washer and dryer combo at Lowe's yesterday after the dryer that came with the house gave up the ghost. It's been a great pair - but about three months ago the kids overheated the washing machine motor, and then the dryer went yesterday, so I took it as a sign.

Time to knuckle down and just get a new set.

I have to admit, I'm a fan of the gas dryer - I grew up with electric dryers, but when I got to the grad house in college they had gas, and I've never looked back. There's just something about the feel of the heat when you open the dryer and it's just done. It's very comforting to me.

One thing I'm kinda surprised they haven't done yet is put these devices on WiFi and have them message an app on your phone. I mean, think about it - it can't be all that hard, and they already put the electronics on these guys to do 57 different cycles and all that, so how hard would it be to text or message a phone?

And like my car, the manufacturer could make one chip/app combo for all their models... it just seems like something for the Uber/Swipe-Right generation.

In any case, it's 55 years today, and I hope that after the delivery this morning my day is nicely calm and uneventful.

For the Love of Spock – Wonderful

Sunday, December 25th, 2016

I just finished watching For the Love of SPOCK on Netflix, and it was amazing. The people Leonard Nimoy worked with are all in the film... the memories, both good and bad, were wonderful and full of love for this man, and the character he created. I grew up on Star Trek, so I could quote you every line they showed clips from. I knew of the failed pilot that became the two-part show with Capt. Pike... all the lines. All the smiles.

And then to see the actors talk about him. It was impressive.

If you have any interest in complicated relationships, and hope that one day they will repair themselves, even through tragedy, this is a story you want to spend two hours on. It's worth it.

Merry Christmas!

Sunday, December 25th, 2016

It's that most wonderful day of the year. This morning I got up, watched a little TV, and went to church - the service was an hour later than normal because it was going to be just the one service for the morning, but that's OK. I'm flexible. And during the service, I remembered something the Senior Pastor had said several years before - on my first Christmas without my family. He had said "Be conscious of who is around you today - realize that today is the worst day someone's ever had. And it's also the best day someone else has ever had."

This morning, as I was texting all my family and friends, I realized what my recovery meant to me. It's that both those days - the best and the worst - were happening every day for me. There isn't a day I live now that isn't plagued with what I've lost. How my wife and kids simply can't stand me - and yet they will not tell me why.

"I never really loved you... You were just a better alternative than moving back home [after college]" - that was all I was ever going to get. The kids even less.

But at the same time, I still loved them, and wished them well.

I took presents over to my two youngest this week, and they took them and shut the door. I had asked them if there was anything they wanted - no response. I text them weekly - Sunday mornings, in fact - and get nothing.

Yet today is also the very best day of my life. I could get up and go to this wonderful church. I could feel that I'm actually worth something. I could sit and create systems that make me feel like the beauty and grace of all living things is right in front of me - and it is a joy like none I've ever felt.

And it always has been.

So today I've realized that my life now is that every day is both the best day I've ever had, and the worst day I've ever had. It's not that it makes anything better, but to me, it explains so much. Why I can be happy about what I'm creating at work, and be so close to tears about what is not - and never likely will be again.

To understand our situation is to take the first steps of coming to terms with it - making peace with it. I hope that comes... but I'm not expecting it. Understanding is good enough for today.

Merry Christmas!

Netflix is Creating Amazing Content

Saturday, December 24th, 2016

I'm watching Spectral on Netflix. It's not getting great reviews, but I think it's pretty slick. I'm not a big fan of the war movie genre, but this is a pretty decent take on it, and while I took exception to the main character being an engineer for DARPA - and carrying two packs (no engineer does that) - it's still not bad. He's not a bad character.

There's also Medici, and other productions. Pretty impressive. But I don't want to see them stop showing the oldies - that's part of the reason for having access to Netflix.

Gotta say, this is impressive stuff, though.

Making an Effort

Saturday, December 24th, 2016

I was sitting here today, and realized that I could make more of an effort to keep the journal up to date - even if it's just nice code I've done, or found, and a few little things thrown in as they come up. Life isn't going to just get better... it's going to be just like it is. For a very long time.

But that doesn't mean that it can't also be a time to get things done.

So I want to make more of an effort to write. About something. I don't know if it'll change anything, but if there's even a 0.01% chance, it's worth putting in the effort.

Unzipping a File in S3 to S3

Saturday, December 3rd, 2016

In the work I'm doing, I've got a service that returns a group of files in a standard Zip file format, and then I can easily store that in S3 using amazonica and a little bit of code. This assumes the amazonica.aws.s3 functions are pulled into the namespace, along with a few local helpers (uuid, file-exists?, get-cred, try-amzn) that aren't shown here:

  (defn save!
    "Given a byte array or string `data` and a bucket `b`, saves the
    bytes as an encrypted file with file name `fname`. Optional keys
    `content-type` and `overwrite` may also be passed, where the
    `:content-type` key indicates the content type (defaulted to S3's
    inference), and the `:overwrite` key indicates whether to replace
    the file if it already exists (defaulted to false)."
    [data b fname & {:keys [content-type overwrite input-stream-size]}]
    (if (and (or (string? data) (byte-array? data) (instance? InputStream data)) b)
      (let [nm (or fname (uuid))
            echo (if-not fname {:name nm})
            [inp cnt] (cond
                        (string? data)     [(ByteArrayInputStream. (.getBytes data))
                                            (count data)]
                        (byte-array? data) [(ByteArrayInputStream. data)
                                            (count data)]
                        (instance? InputStream data) [data input-stream-size]
                        :else [nil nil])]
        (if (or overwrite (false? (file-exists? b nm)))
          (try-amzn 3
            (merge
              echo
              (put-object (get-cred)
                          :bucket-name b
                          :key nm
                          :input-stream inp
                          :metadata (merge
                                      (if content-type {:content-type content-type})
                                      (if cnt {:content-length cnt})
                                      {:server-side-encryption "AES256"}))))
          (merge echo {:error "FileNotSaved"})))))
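
Just for context, a call ends up looking something like this - the bucket, key, and payload here are just placeholders for the example:

  ;; hypothetical usage - bucket, key, and contents are placeholders
  (save! "Hello, S3!" "my-bucket" "notes/hello.txt"
         :content-type "text/plain"
         :overwrite true)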

But what if it's a zip file? If we want to do this in one pass, we have to load the entire contents of the file into memory, and then piece it apart. That's certainly possible, but what if the files are very large? Why not unzip the stream, and write it back to S3 as a stream?

Then, we don't have to have a large memory footprint in order to process the large zip files. That would be nice.

  (defn unzip!
    "Function to unzip the provided file in the provided bucket into the same
    S3 bucket where the directory is the name of the file - without the extension,
    and all the files from the zip file are deposited into the directory under
    their names in the zip archive. The downside of this function is that it has
    to read the zip file from S3 'n' times - once for each of the files in the
    zip archive. That means that it's not at all fast. This returns a sequence
    of all the files that have been unzipped to S3:
      [\"13B7E73B053C497D82F8FCC28FC8127F/13b7e73b053c497d82f8fcc28fc8127f.XML\"
       \"13B7E73B053C497D82F8FCC28FC8127F/Form0001.PDF\"
       \"13B7E73B053C497D82F8FCC28FC8127F/Index.ctl\"]
    "
    [bkt fname]
    (if (file-exists? bkt fname)
      (if-let [base (some identity
                          (rest (re-matches #"(?i)(.*)\.zip$|(.*)\.xfr$" fname)))]
        (let [afn (atom [])
              push (fn [ze zis]
                     (if (not (.isDirectory ze))
                       (let [to-file (str base "/" (.getName ze))
                             res (save! zis bkt to-file
                                        :overwrite true
                                        :input-stream-size (.getSize ze))]
                         (if-not (:error res) (swap! afn conj to-file)))))
              s (get-stream bkt fname)]
          (with-open [z (ZipInputStream. s)]
            (doseq [e (entries z)
                    :let [en (.getName e)
                          ms (get-stream bkt fname)]]
              (with-open [mis (ZipInputStream. ms)]
                (let [me (entries mis)]
                  (push (first (drop-while #(not= en (.getName %)) me)) mis))))
            @afn)))))
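
Calling it is just the bucket and the key of the zip file - the names here are only an example, matching the docstring above:

  ;; hypothetical usage - returns the sequence of S3 keys that were written
  (unzip! "my-bucket" "13B7E73B053C497D82F8FCC28FC8127F.zip")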

The result is something that has to read the stream once for each file in the zip archive, but then it can write each of those back to S3 as it goes. It's not super network efficient, but the existing library closes the stream when it's done reading a file, and if it just didn't do that, then I could have done it all in one pass.
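
Just to sketch what I mean - and this is an untested idea, not anything that's running - if put-object's close could be turned into a no-op, a single pass might look something like this, leaning on the same get-stream and save! helpers from above:

  ;; untested sketch: subclass ZipInputStream so that the close() issued
  ;; by put-object is ignored, which lets us keep walking the entries in
  ;; a single pass over the stream from S3
  (defn unzip-one-pass!
    [bkt fname]
    (if-let [base (some identity
                        (rest (re-matches #"(?i)(.*)\.zip$|(.*)\.xfr$" fname)))]
      (with-open [s (get-stream bkt fname)]
        (let [z (proxy [ZipInputStream] [s]
                  (close [] nil))]   ; swallow the close from put-object
          (loop [acc []]
            (if-let [e (.getNextEntry z)]
              (if (.isDirectory e)
                (recur acc)
                (let [to-file (str base "/" (.getName e))
                      res (save! z bkt to-file
                                 :overwrite true
                                 :input-stream-size (.getSize e))]
                  (recur (if (:error res) acc (conj acc to-file)))))
              acc))))))

The proxy just keeps the ZipInputStream readable after each entry is written, and the with-open still closes the actual S3 stream at the end. Since the library does close the stream, though, the n-pass version above is what I'm actually using.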

Still... this is nice. It works great, and does just what I need.