When posting benchmark comparisons, it’s pretty important to ensure that you’re testing the same amount of work. For example, if you’re comparing the ability to construct objects and index them, you should make sure that you’re generating the same indexes. Similarly, when you’re comparing languages and benchmarking the costs of things, you really should make sure that your programs are doing the same amount of work.
For example, consider Serge Hulne’s posting to the Vala newsgroup, in which he shows that Go is 2x slower than Vala. He posts programs in both languages that calculate the 10 most used words in the entire works of William Shakespeare. His algorithm is not the most efficient way to perform this task, but that’s not the point; his point was to exercise aspects of both languages: object creation, array and associative array performance, IO; that sort of thing. The problem is that his Go version did a lot more unnecessary work than his Vala version; whether he did this on purpose to make Vala look better, or just accidentally left in some debugging code, I don’t know… but if you’re going to make assertions about performance benchmarks and you don’t make sure that you’re being as fair as reasonably possible given your skill level, then you just end up making yourself look silly.
In Serge’s case, his Go program processed the text as Unicode, whereas his Vala program processed it as raw bytes. There was no excuse for this; it was a trivial change to make the Go version process bytes, too. That smells a lot like an intentional cheat. Then, too, the Go version did extra work that the Vala version didn’t, like calculating the number of lines, words, and characters in the file – although it never did anything with the values except sum them up. That’s 5M (chars) + 900k (words) + 124k (lines) additional addition operations for the Go version. Finally, there was an unnecessary if-else-clause in a loop in the Go version that was being executed for every word, so 900k more branch operations (and branches are relatively expensive, as far as CPU operations go).
Getting rid of the things the Go code was doing that the Vala code wasn’t improved the speed of the Go program by 30% (from 1.1s down to 0.78s).
Go isn’t going to be as fast as Vala, at least not in the immediate future; Vala does a pretty good job in its translation to C, and GCC is pretty scary in its ability to optimize. Go’s compiler (6g, which is what I used, because gccgo isn’t available on my Mac) is new – although I don’t doubt Google’s ability to eventually bring it down to the point where there’s very little discernible difference in performance. My point is that if people are going to pretend to be scientists, doing benchmarks, collecting statistics, and presenting them as proofs, they really should be more careful about their code.
By the way, I’m not a Go programmer. I don’t even know Go, beyond what I’ve learned by fixing Serge’s code. I haven’t read the tutorial, and this was the first time I’d ever compiled a Go program. There’s something to be learned from that: both Vala and Go are readable enough that I could easily see what the two programs were doing and identify the functional differences. Ironically, I did play around with Vala when it first came out, so I actually had more experience with Vala when I first came across Serge’s code. For what it’s worth, after this exercise, I’ve found that I prefer Go, and could see myself writing code in it. Vala is more like C++; it’s basically C with some tacked-on features – that’s one reason why its benchmarks are so close to C’s. Go is a new language, with C-like syntax – sort of like Java was, when it first came out, only less so. I’ll have to see how goroutines work out; if they’re not as effective as Erlang’s processes, I may lose interest.
Jorg Walter posted a sort of critique focusing on Go (well, mostly it’s a highly subjective rant) that gives a good overview of some of the more compelling features of Go; if you can ignore the opinions-stated-as-fact (which have the flavor of “only idiots prefer their braces on their own lines!”) and the occasionally erroneous “factoid” (“Ruby faded away” – huh?1), it’s an interesting read.