Today, with the update to Xcode 10.2 and Swift 5.0, I had to wrestle with formatting strings in order to find the pattern they represent. The idea is really this: represent the pattern of the characters in a word so that 'bee', 'too', and 'see' all produce the same pattern. This is used in my CryptoQuip solver, where the point is to group words by their patterns - because it's a simple substitution cipher, we don't know what the actual characters are, but we do know their patterns.
So how do we do that? Well... when we look at Clojure - which deals with the string as just a sequence of characters - just data - we have:
(defn pattern
  "Function to take a word (as a string) and return a sequence that is
  the pattern of that word, where the values are the index of the first
  occurrence of each character.
    => (pattern \"see\")
    (0 1 1)
    => (pattern \"rabbit\")
    (0 1 2 2 4 5)
  "
  [word]
  (map #(.indexOf word (int %)) word))
And if, for the sake of this post, we drop the docstring, we get something very simple:
(defn pattern
  [word]
  (map #(.indexOf word (int %)) word))
If we look at the same thing in Swift, we see:
extension String {
    /**
     Attribute of a string that returns a string that is the pattern of
     that word, where each character is mapped to a letter based on the
     index of its first occurrence. This is a simple baseline pattern
     generator for the words so they are comparable.
     ```
     => "see".pattern
     "abb"
     => "rabbit".pattern
     "abccef"
     ```
     */
    var pattern: String {
        get {
            let ascii: [Character] = ["a","b","c","d","e","f","g","h","i","j","k",
                                      "l","m","n","o","p","q","r","s","t","u","v",
                                      "w","x","y","z"]
            var ans = ""
            let src = Array(self.utf8)
            for c in src {
                // the first occurrence of this byte decides its pattern letter
                ans.append(ascii[src.firstIndex(of: c)!])
            }
            return ans
        }
    }
}
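To see that this matches the goal from the top of the post - 'bee', 'too', and 'see' collapsing to the same pattern - here's a quick check. This is just a sketch, assuming the extension above is in scope:

// Sanity check of the pattern property - the first three words
// should all produce the identical pattern string "abb".
for w in ["bee", "too", "see", "rabbit"] {
    print("\(w) -> \(w.pattern)")
}
// bee -> abb
// too -> abb
// see -> abb
// rabbit -> abccef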
And the reason to use a String, as opposed to a sequence of numbers, in Swift is that sequence comparisons are not nearly as nice in Swift as simple String comparisons are. But I look at all this, and while extensions in Swift are nice - adding methods to existing classes - it's much the same as open classes in Ruby, and it can lead to some very tough bugs, so you have to be very careful using them.
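That String pattern also makes the grouping step in the solver nearly a one-liner, since Strings are Hashable and work directly as Dictionary keys. A minimal sketch - the word list here is made up for illustration; a real solver would load a proper dictionary:

// Group a (hypothetical) word list by pattern - the candidates for
// any cipherword are just the bucket with the matching pattern.
let words = ["see", "bee", "too", "off", "rabbit", "hobbit"]
let byPattern = Dictionary(grouping: words) { $0.pattern }
// byPattern["abb"]    == ["see", "bee", "too", "off"]
// byPattern["abccef"] == ["rabbit", "hobbit"]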
And this got me thinking about the complexity of most systems, and the OOA&D I've seen in C++, Java, Ruby, ObjC, and Swift - it really is hard to come up with a great design if you don't put a ton of effort into the work. Sure... Boost for C++ and the Java class libraries are good designs - but they had a lot of backing and time to get right. ObjC - specifically Foundation - is well-done, but again, that was a serious investment by NeXT. But most of the non-OS-level projects... the ones in the wild... they're a mess.
I don't think this is an accident. I think good, solid OOA&D is hard because there are so many times when a method doesn't clearly belong to one object or another - and the language might not be set up to have stand-alone functions as an alternative - Java, Scala. So the method has to go somewhere, and that means things get tossed into the closest reasonable object - as long as it's "close enough". But then six months later, it's a disaster. No one remembers why each method is on these objects... and the circular references require interfaces, and then implementations of those interfaces... and it just gets to be a mess.
What I believe is that the simpler the code, the better. That means more abstraction. More critical thinking, and less "let's just hammer this out" work. I'm sure there are folks who can do a good job on an OOA&D project - even a massive, complex one - but those people are rare - very rare. In general, you end up with really bad objects that create horrible inclusion requirements, and even worse threading issues - because it's all mutable - and you just have systems that can't get larger than a certain size.
That's not good. Math doesn't work that way. Neither should coding.