Archive for the ‘Vendors’ Category

Flickr is Changing the Free Service

Monday, November 5th, 2018

flickr.jpg

This morning I read that as of February 2019, Flickr's free service will no longer include 1TB of storage, as it has - it will be capped at 1,000 images instead. The change, they say, is meant to make the service more focused on photographers:

First, and most crucially, the free terabyte largely attracted members who were drawn by the free storage, not by engagement with other lovers of photography. This caused a significant tonal shift in our platform, away from the community interaction and exploration of shared interests that makes Flickr the best shared home for photographers in the world. We know those of you who value a vibrant community didn’t like this shift, and with this change we’re re-committing Flickr to focus on fostering this interaction.

Now as someone who uses Flickr for both pictures and for blog posts with screen shots, I can understand that storage isn't free, and 1TB per free user is probably a cost the new company couldn't bear - and they knew it from Day 1. This is how they get back in the black.

The only alternative is Pro, at $49.99/yr, and that's unlimited storage. So there's not going to be a low-cost tier with the old 1TB limit. Nope. It's go big - or not. Maybe 1000 photos is enough, but I'd really like it if we could use iCloud storage to vend pictures out of - though I know why that's not possible. So we'll have to see how things pan out.

It's not like I can't figure something else out for my blog.

Finished an Online Course

Monday, November 5th, 2018

cubeLifeView.gif

This is interesting... I just finished an online course about Data Science, covered by The Shop, in an effort to be able to reach across the divide that currently exists between the science group and the engineering group. It doesn't need to exist, but it's there, and I was hoping that by taking this course, I'd be seen as trying to reach out. Maybe help things a little.

The class was meant to be 5 weeks, and from the sound of it, it was going to be mentored by some folks here in the science group. Again, sounds like just what I want - bonding experiences in class, and all that. Good. But when I signed up for the class, it was clear that it was offered by a larger institution and wasn't really mentored by folks here - so much as we would have one-hour meetings each week about the content of the course for that week.

So not at all what I was hoping for. But I couldn't really get upset about the course - it was exactly what it said it was, I had just assumed facts without checking them first. That's all on me.

The course was focused on understanding the basics of Data Science work: installing and running R and RStudio, working with Git and GitHub, and a few shell commands. Not bad - given that each week of work was about 25-30 mins of videos to watch. That's not a lot of time if you want to teach someone shell commands.

But it got me thinking about a real Data Science class for The Shop. These developers all understand math, calculus, all that... and they know the tools... so what about really teaching them something? That would be something to sit in on. So I sent the idea to my group just as a "This would be nice..." thought.

I guess this will be my first grade after my PhD, which is, in a way, very funny to me. But it's done, and now it's time to see what's next.

Updating the JDK Versions

Thursday, October 11th, 2018

java-logo-thumb.png

This morning I noticed that Clojure 1.10.0-RC1 was out, and that it was setting a minimum of JDK 1.8 as the required Java version. That's not a problem, as that's what I've been using, but I thought it might be a good time to get the latest versions of the JDK installed on my box, and make sure that JDK 10 and JDK 11 work with the setjdk Bash function I wrote to make it easy to change from version to version in the environment.

So I went to the java.com web site, and the first thing I noticed was that the version they are suggesting is JDK 1.8 - and that really surprised me. I knew that JDK 9, 10, and 11 have been out for a bit, but to see that Oracle is suggesting that a new user install JDK 1.8.0_181 - that's just a little surprising.

Also, they made it a lot harder to find the other versions of the JDK - maybe that's because there was more confusion than necessary for folks just looking for the latest JDK - but therein lies my concern: JDK 1.8 is what they point people to. Still... I found what I needed and was able to get the latest JDK 1.8.0_181, JDK 10.0.2, and JDK 11. They have clearly decided to change the versioning scheme, which is OK with me, but it means that I really need to make sure I check the setjdk function when I install these.

When I got them installed, they all looked in place:

  peabody{drbob}516: ls -lsa /Library/Java/JavaVirtualMachines/
  total 0
  0 drwxr-xr-x  13 root  wheel  416 Oct 11 10:29 ./
  0 drwxr-xr-x   5 root  wheel  160 Sep 24 19:00 ../
  0 drwxr-xr-x   3 root  wheel   96 Jun 29  2011 1.6.0_26-b03-383.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Oct 11 10:29 jdk-10.0.2.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Oct 11 10:29 jdk-11.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Feb  5  2013 jdk1.7.0_13.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Oct 16  2013 jdk1.7.0_45.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Jan 17  2014 jdk1.7.0_51.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Mar 20  2015 jdk1.7.0_75.jdk/
  0 drwxr-xr-x   3 root  wheel   96 May  1  2017 jdk1.8.0_131.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Oct  2  2017 jdk1.8.0_144.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Oct 11 10:28 jdk1.8.0_181.jdk/
  0 drwxr-xr-x   3 root  wheel   96 Mar 20  2015 jdk1.8.0_40.jdk/

and then a quick check of the setjdk script:

  peabody{drbob}504: setjdk 10
  peabody{drbob}505: echo $JAVA_HOME
  /Library/Java/JavaVirtualMachines/jdk-10.0.2.jdk/Contents/Home
  peabody{drbob}506: setjdk 11
  peabody{drbob}507: echo $JAVA_HOME
  /Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home
  peabody{drbob}508: setjdk 1.8
  peabody{drbob}509: echo $JAVA_HOME
  /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home
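
And since the whole reason for this exercise is Clojure 1.10, a quick sanity check from a REPL shows which JDK the environment is actually handing to the JVM. This is just the output I'd expect on this box after a setjdk 10 - a sketch, not a captured session:

  ;; in a lein repl started after 'setjdk 10' - hypothetical output
  user=> (System/getProperty "java.version")
  "10.0.2"
  user=> (System/getProperty "java.home")
  "/Library/Java/JavaVirtualMachines/jdk-10.0.2.jdk/Contents/Home"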

This is the current cut of my setjdk function in my ~/.bashrc file:

  #
  # Clever trick to leverage the /usr/bin/java commands to take advantage
  # of the JAVA_HOME environment variable and the /usr/libexec/java_home
  # executable to change the JDK on-the-fly. This is so easy I'm amazed.
  #
  function removeFromPath() {
    export PATH=$(echo $PATH | sed -E -e "s;:$1;;" -e "s;$1:?;;")
  }
 
  function setjdk() {
    if [ $# -ne 0 ]; then
      removeFromPath '/System/Library/Frameworks/JavaVM.framework/Home/bin'
      if [ -n "${JAVA_HOME+x}" ]; then
        removeFromPath $JAVA_HOME
      fi
      export JAVA_HOME=`/usr/libexec/java_home -v $@`
    fi
  }
  setjdk 1.8

So now I can work with any version of Java out there. What I did find interesting is that they have pulled JDK 9 - and that means it was really bad. Like Wow bad... at least they knew to pull it.

Datadog Gauges in Clojure

Tuesday, August 18th, 2015

Datadog

The Shop is big into Datadog, and it's not a bad metrics collection tool - I've used it very successfully from clojure. Given the local Datadog Agent (freely available from Datadog) and how easily it's placed on linux hosts - AWS or otherwise - it's a clear win for collecting metrics from your code and shipping them to a nice graphing/alerting platform like Datadog.

The code I've set up for this is pretty simple, and based on the com.codahale.metrics java libraries. With a simple inclusion into your project.clj file:

  [io.dropwizard.metrics/metrics-core "3.1.0"]
  [org.coursera/dropwizard-metrics-datadog "1.0.2"]

you can then write a very nice metrics namespace:

  (ns ns-toolkit.metrics
    "This is the code that handles the metrics and events through the Dropwizard
    Metrics core library, which, in turn, will ship it over UDP to the DataDog
    Agent running on localhost."
    (:require [clojure.tools.logging :refer [infof debugf warnf errorf]])
    (:import [com.codahale.metrics MetricRegistry]
             [org.coursera.metrics.datadog DatadogReporter]
             [org.coursera.metrics.datadog.transport UdpTransportFactory
                                                     UdpTransport]
             [java.util.concurrent TimeUnit]))
 
  ;; Create a simple MetricRegistry - but make it only when it's needed
  (defonce def-registry
    (delay
      (let [reg (MetricRegistry.)
            udp (.build (UdpTransportFactory.))
            rpt (-> (DatadogReporter/forRegistry reg)
                  (.withTransport udp)
                  (.withHost "localhost")
                  (.convertDurationsTo TimeUnit/MILLISECONDS)
                  (.convertRatesTo TimeUnit/SECONDS)
                  (.build))]
        (.start rpt 5 TimeUnit/SECONDS)
        reg)))
 
  ;; Somewhat faking java.jdbc's original *connection* behavior so that
  ;; we don't have to pass one around.
  (def ^:dynamic *registry* nil)
 
  (defn registry
    "Function to return either the externally provided MetricRegistry, or the
    default one that's constructed when it's needed, above. This allows the user
    the flexibility to live with the default - or make one just for their needs."
    []
    (or *registry* @def-registry))

And then we can define the simple instrumentation types from this:

  ;;
  ;; Functions to create/locate the different Metrics instruments available
  ;;
 
  (defn meter
    "Function to return a Meter for the registry with the provided tag
    (a String)."
    [tag]
    (if (string? tag)
      (.meter (registry) tag)))
 
  (defn counter
    "Function to return a Counter for the registry with the provided tag
    (a String)."
    [tag]
    (if (string? tag)
      (.counter (registry) tag)))
 
  (defn histogram
    "Function to return a Histogram for the registry with the provided tag
    (a String)."
    [tag]
    (if (string? tag)
      (.histogram (registry) tag)))
 
  (defn timer
    "Function to return a Timer for the registry with the provided tag
    (a String)."
    [tag]
    (if (string? tag)
      (.timer (registry) tag)))

These can then be held in maps or used for any reason at all. They automatically send their data to the local Datadog Agent over UDP so there's no delay to the logger, and since it's on the same box, the likelihood that something will be dropped is very small. It's a wonderful scheme.
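
Just to make the usage concrete, here's a minimal sketch of how these instruments get used in application code - the metric names and the process function are made up for illustration:

  ;; hypothetical instruments - the tag names are illustrative only
  (def hits (meter "trident.api.hits"))
  (def errs (counter "trident.api.errors"))
  (def lat  (timer "trident.api.latency"))

  (defn handle-request
    "Hypothetical request handler instrumented with the tools above."
    [req]
    (.mark hits)
    (let [ctx (.time lat)]
      (try
        (process req)          ;; stand-in for the real work
        (catch Throwable t
          (.inc errs)
          (throw t))
        (finally
          (.stop ctx)))))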

But one of the things that's not covered in these metrics is the Gauge. And there's a really good reason for that - the Gauge for Datadog is something that is read by the Datadog Agent, and so has to be held onto by the code so that subsequent calls can be made against it for its value.

In its simplest form, the Gauge is just a value that's read by the agent on some interval and sent to the Datadog service. This callback functionality is done with a simple anonymous inner class in Java, but that's hard to do in clojure - or is it?

Clojure gives us something that makes this quite easy - reify. If we simply add an import:

  (:import [com.codahale.metrics Gauge])

and then we can write the code to create an instance of Gauge with a custom getValue() method, where we can put any clojure code we want. Like:

  ;;
  ;; Java functions for the Metrics library (DataDog) so that we can
  ;; constantly monitor the breakdown of the active docs in the system
  ;; by these functions.
  ;;
  (defn cnt-status
    "Function that takes a status value and finds the count of loans
    in the `laggy-counts` response that has that status. This is used
    in all the metrics findings - as it's the exact same code - just
    different status values."
    [s]
    (reify
      Gauge
      (getValue [this]
        (let [sm (first (filter #(= s (:status %)) (laggy-counts)))]
          (parse-int (:count sm))))))
 
  (defn register-breakdown
    "Function to register all the breakdowns of the loan status counts
    with the local Datadog agent to be sent to Datadog for plotting. This
    is a little interesting because Datadog will call *these* functions
    as needed to get the data to send, and we will control the load by
    using memoized functions."
    []
    (.register (met/registry)
      "trident.loan_breakdown.unset"
      (cnt-status nil))
    (.register (met/registry)
      "trident.loan_breakdown.submit_to_agent"
      (cnt-status "Submit to Agent"))
    (.register (met/registry)
      "trident.loan_breakdown.submit_to_lender"
      (cnt-status "Submit to Lender"))
    (.register (met/registry)
      "trident.loan_breakdown.submit_to_lender_approved"
      (cnt-status "Submit to Lender - Agent Approved"))
    (.register (met/registry)
      "trident.loan_breakdown.lender_approved"
      (cnt-status "Lender Approved")))

What I like about this is that I can allow the Datadog Agent to hit this code as often as it wants, and I don't have to worry about the freshness of the data - or an excessive load on the server resources from being hit too much. I can simply memoize the functions I'm using and then control the load on my end. It's very clean, and very nice.
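
For completeness, the memoization I have in mind is nothing fancy - something along these lines using the clojure.core.memoize library, where raw-counts and the 30-second window are just illustrative:

  (require '[clojure.core.memoize :as memo])

  ;; hypothetical sketch: cache the expensive count query for 30 seconds so the
  ;; Datadog Agent can poll the Gauges as often as it likes without hammering
  ;; whatever sits behind raw-counts
  (def laggy-counts
    (memo/ttl raw-counts :ttl/threshold 30000))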

Built a Nice Simple Clojure/Bootstrap/Google App

Tuesday, June 2nd, 2015

Clojure.jpg

Over the course of the last day and a half, I've been putting together a nice little test app using clojure and compojure on the back-end, and Bootstrap with Handsontable as the UI. It's small and not all that complicated, but it's nice in that it demonstrates a lot of the things that I often want to do in a simple application.

Then today one of the guys I was putting this together for suggested OAuth2 for authentication and then something internal for authorization. This turned out to be not all that bad with Google's Identity Service. They have a nice little JavaScript client that gets the auth token, which you then feed to the calls to the service, and it, in turn, hits Google to make sure this person is who they say they are. You end up with an email address, and that can be used as the "username" for any authorization scheme you want.
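
Server-side, the check is nothing exotic, either. Here's a rough sketch of turning the token the JavaScript client hands you into an email address, using clj-http against Google's tokeninfo endpoint - the endpoint, parameter, and claim names here are from memory and should be checked against Google's docs:

  (require '[clj-http.client :as http])

  (defn token->email
    "Hedged sketch: ask Google to validate the ID token from the browser and
    pull the email claim out of the response. Error handling is omitted."
    [id-token]
    (let [resp (http/get "https://www.googleapis.com/oauth2/v3/tokeninfo"
                         {:query-params {"id_token" id-token}
                          :as :json})]
      (get-in resp [:body :email])))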

In short, it's very slick. Works like a dream. I'm having a pretty good time with this.


Twitter Ad Service API

Thursday, May 14th, 2015

Twitterrific.jpg

Today I spent a good bit of the day trying to figure out how to authenticate with Twitter's OAuth 1.0 system, and I think I'm getting close, but I'm still a bit away because I don't control these accounts, and the sheer number of ways to authenticate on Twitter is daunting. Let alone the different APIs.

There is the client-facing Tweets API, and then there's the Ad Server API, and it's not at all clear that there need to be different authentication schemes for these APIs. But it should be clear that access to one set of APIs probably should not guarantee access to another set - and maybe they handle that in the authorization, but it's not clear from the docs I'm reading.

And speaking of docs, wow... these are really something else. There are at least four ways to authenticate, but they ask people to use libraries - that they don't provide. Sadly, I don't see one that does 100% of what I need. I do see an OAuth 1.0 library, but the Client ID and Secret are nowhere to be found on their site.

So clearly, I'm missing something.

What I believe is that you have to create an App, which then gets you the redirect URL, the ID, and the Secret. There were none defined to base a new one on, so I sent off an email to the Twitter representative to see if this was, indeed, the preferred way.

While I was waiting, I decided to try and make an app. Yet in order to do that, you need to assign a mobile phone number to the Twitter account, and I can't really do that because the account is not mine. So I sent another email to the relationship folks in The Shop about that.

In short, it's just a waiting game. But it's also so much more of a mess than the other systems I've been integrating with. Wow...

Heroku Adds Redis

Tuesday, May 12th, 2015

Heroku

This afternoon I saw a tweet from Heroku about them adding Redis to the add-ons for their service. This comes just a few days after their announcement that Postgres was available for the free tier, and the new “Free” tier for apps. They are getting aggressive with the services they are providing. This makes a ton of sense to me, as I'm a huge fan of redis from clojure, and it makes all the experience I've got in building apps directly transferable.

While I know the Free tier isn't all that great, the idea that there is a Free tier is amazing, and it means that I can write something and throw it up there, and as it's needed, I can scale it up. Very cool. They also have a hobbyist tier that's only something like $8/mo. - similar to GitHub.

If I needed to be firing up a web service, it'd be clojure, redis, and postgres - all on Heroku. What an amazing service.

LinkedIn API for a Recruiting Tool

Thursday, May 7th, 2015

LinkedIn

I was asked today to look at the LinkedIn API to see if I could access the data at LinkedIn to make an advanced recruiting tool for the Recruiters here at The Shop. The idea was to take a resume we received, match it to a LinkedIn profile (I'd venture that 90%+ of them are there), and then use advanced analytics to rate the prospective resumes for potential success at this job.

It's an interesting idea. The real advantage for LinkedIn is that companies like ours pay several thousand dollars a month to have access. With this kind of tool, that same data could be used to classify candidates by the data they have entered, and then a nice predictive model could say who is most likely to succeed. It's simple feedback.

We take everyone that's been successful at this company, reference their LinkedIn profiles for the training data, and then use any and all reviews to say which of these people are likely to be the successful ones - based on all the classified data that's on LinkedIn.

It's kinda neat. We don't have to wonder what factor(s) matter most - we can get all that LinkedIn has in their API, use the success factor - say, a 1-5 rating - as the outcome, and then train away. After that, every submitted resume can be run through the trained net and come up with a score and a confidence number. Pretty simple.

It's not meant to be fool-proof, but when you have a ton of openings, it's nice to be able to have something that whittles down the list of thousands to hundreds, or less - so that you can really focus on these people.

We'll see where it goes - if it goes anywhere.

Dug a Little on Interactive Brokers

Thursday, May 7th, 2015

WallSt.jpg

A friend of mine has asked me to look into the Interactive Brokers offerings, as they have an API for trading, and he's interested in moving off the Windows platform he's on now (.NET) and having me help him scale up his trading strategy quite a bit. So I started looking at what they have.

First off, they are cross-platform, as well as web-based. That's nice. Their API has virtually everything you need to get market data, execute and monitor trades, and do all the portfolio management and reporting. Very nicely done. It's also all on GitHub, and they are willing to look at pull-requests on the API. This is really nice for a lot of reasons, but the most significant to me today is that they don't see this as a complete control situation. That tells me a lot.

Honestly, I was getting kinda jazzed about it because I remember all this stuff from Finance. I can see making a process that runs alongside Trader Workstation - their flagship product - registering for ticks and trades, calculating everything it needs, and submitting trades as needed. These could all be viewed in the TWS screen, as the account would have to be the same. It's like having an automated trader working for you, and all you have to do is watch it work.

I read a book on their Java API, and dug into the API Docs as well as reading on the relationship between the IB API and TWS. It's not a bad system. Given that this is not a high-frequency strategy, there's no need for co-location, and the associated costs. It's something that hopefully can run on a MacBook Pro - or if not, then a Mac Pro, and a nice, fast cable modem.

I looked it up, and Comcast can go to 105 Mbps in my area, and I'm at 50 Mbps already. The package for 105 Mbps is pretty decent and about a wash with what I'm paying now for 50 Mbps. I'd have to call and verify and make sure the cost delta is what the web site says, but if so, then it's very doable.

It might be really nice to be back in finance. Start-up again. Holy Cow! What an idea. 🙂

Interesting Problem with Google AdWords API

Wednesday, April 15th, 2015

AdWords

I was working on stripping out a little bit of code from a project at The Shop today, and when I stripped out the library I had made, I got the following error when trying to start the REPL:

  $ lein repl
  Exception in thread "main" java.lang.NoClassDefFoundError:
  clojure/tools/logging/impl/LoggerFactory, compiling:
  (/private/var/folders/ct/jhkds06j26v1lq2t40jx4cndl_629q/T/
  form-init4416651774354867948.clj:1:124)
      at clojure.lang.Compiler.load(Compiler.java:7142)
      at clojure.lang.Compiler.loadFile(Compiler.java:7086)
      at clojure.main$load_script.invoke(main.clj:274)
      at clojure.main$init_opt.invoke(main.clj:279)
      at clojure.main$initialize.invoke(main.clj:307)
      at clojure.main$null_opt.invoke(main.clj:342)
      at clojure.main$main.doInvoke(main.clj:420)

If I put the library back into the project.clj, I don't get this error, but if I take it out, the error returns. And I needed to take it out.

When I did a lein deps :tree to see what overlaps there might be in the libraries, I found a few in the Google AdWords libraries. Easily enough, I took them out, and forced the right version with a simple:

  ;; AdWords Java API
  [commons-lang "2.6"]
  [com.google.api-ads/ads-lib "1.38.0" :exclusions [commons-lang
                                                    org.slf4j/slf4j-api]]
  [com.google.api-ads/adwords-axis "1.38.0" :exclusions [commons-lang
                                                         org.slf4j/slf4j-api]]

because the conflict was in the Google jars and both 2.5 and 2.6 of commons-lang were being used. Simply exclude them both, reference 2.6 first, and that should take care of it.

But it didn't.

So I took to Google and found that someone had solved this by putting the class in the :aot section of the project.clj file. So with that, I tried:

  :aot [clojure.tools.logging.impl bartender.main]

and then things started working just fine again.

I'm guessing that by building the uberjar for the other library, it did this compilation, so that it wasn't necessary for this project. Take it out, and we have a problem. Force the compile first, and it's all good.

Glad I figured that out.