# C[omp]ute

Welcome to my blog, which was once a mailing list of the same name and is still generated by mail. Please reply via the "comment" links.

Always interested in offers/projects/new ideas. Eclectic experience in fields like: numerical computing; Python web; Java enterprise; functional languages; GPGPU; SQL databases; etc. Based in Santiago, Chile; telecommute worldwide. CV; email.

© 2006-2013 Andrew Cooke (site) / post authors (content).

## Go Rocks - How Can We Avoid Something This Bad In The Future?

From: andrew cooke <andrew@...>

Date: Sat, 9 Jul 2011 20:30:15 -0400

Go (the programming language) is no longer new.  But I don't know of a good
explanation for the ambivalent reactions to the language.  So here's my take.

Go is very much the child of its parents (the people who brought you C).  And
it's clear why some people (who typically have programmed in C, Java and
Python) will love it, while others (from a more academic background, who know
Haskell and Scala) will feel it's an ignorant failure.

Both groups have their points - it's a pity that the language couldn't have
combined the practical experience and good, simple taste of the designers with
some more "theoretical" ideas.  Instead it's hard to avoid the idea that it's
a language "proud of its ignorance", and that it suffers as a result.  On the
other hand, what's the alternative (clue: FS)?

OK, so first I'll describe what Go is, and WHY IT'S SO GOOD.

Duck Typing

Go "makes duck typing static" - anything that contains the required set of
methods implements the interface (even if the person writing that code had no
idea the interface existed).  So it combines (much of) the flexibility of
Python with better checking of types.

It also has a simple form of local type inference, so that you don't need to
specify types where they are "obvious" - only on the arguments and return
types of your functions.  This helps make the duck typing more efficient -
method resolution is "pushed back" to when the interface first appears, and
does not need to be repeated on every use.
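
To make that concrete, here's a minimal sketch (the `Quacker` and `Duck` names are my own, invented for illustration):

```go
package main

import "fmt"

// Quacker is satisfied by any type with a Quack() method -- the
// implementing type never has to mention the interface by name.
type Quacker interface {
	Quack() string
}

// Duck was written with no knowledge of Quacker, yet implements it.
type Duck struct{}

func (d Duck) Quack() string { return "quack" }

// greet accepts anything that structurally satisfies Quacker; unlike
// Python's duck typing, the check happens at compile time.
func greet(q Quacker) string { return q.Quack() }

func main() {
	d := Duck{} // := infers the type locally, as described above
	fmt.Println(greet(d))
}
```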

Given all that, you might think that Go is "object oriented".  And it is, in a
sense, but you don't have "proper" inheritance.  Personally, I don't think
that's a problem.

Memory Management

Memory management frees you from worrying about who "owns" a reference.  And
in a C-like language that's a huge deal - it opens up a whole world of
possibilities: first class functions, closures, continuations...  Java should
have exploited this, but if you compare Go with Java, Go clearly comes out on
top: first-class anonymous functions, with closures (coming in Java 8,
honest).

If you have experience with functional programming, but need to write the kind
of low-level code that has traditionally been C's domain, this is awesome
news.  And if writing "low level" code with managed memory seems like a
contradiction, you may be underestimating Go - it still has "new" and
pointers, so you don't lose too much control, but you gain a pile of
flexibility.  This is a big, big win over C (and Java).
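
A small sketch of what that buys you (a toy example of my own): a closure that captures local state, alongside the `new` and pointer machinery that keeps the low-level control:

```go
package main

import "fmt"

// counter returns a closure that captures n.  The garbage collector,
// not the caller, decides when n's storage dies -- something plain C
// cannot express safely.
func counter() func() int {
	n := 0
	return func() int {
		n++
		return n
	}
}

func main() {
	next := counter()
	fmt.Println(next(), next(), next()) // 1 2 3

	// "new" and pointers are still available for low-level control.
	p := new(int)
	*p = 42
	fmt.Println(*p)
}
```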

Go also supports lightweight threads (these seem to be real threads, not
coroutines).  It also has a channel abstraction (think CSP) that, when you
first see it, might have you thinking "oh, cool, Erlang" - but it's not;
everything is inside a single process.  So this is a nice interface to local
concurrency.

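Here's roughly what that channel interface looks like (a toy example of my own -- a goroutine streams values to its caller):

```go
package main

import "fmt"

// squares streams the first n squares over a channel from a separate
// goroutine and collects them -- CSP-style message passing, all
// within a single process.
func squares(n int) []int {
	ch := make(chan int)
	go func() {
		for i := 1; i <= n; i++ {
			ch <- i * i // send
		}
		close(ch)
	}()
	var out []int
	for v := range ch { // receive until the channel is closed
		out = append(out, v)
	}
	return out
}

func main() {
	fmt.Println(squares(3)) // [1 4 9]
}
```
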
So Go is C with memory management (Java really did help the world), a cool
take on Python's types, and local threads.  A modern C.  Everything that Java
and C++ failed to be.  Small, elegant, powerful.  And that's great.  Really.
It rocks.  It's not a hugely ambitious language, but if this is the next
mainstream programming language then we have progress.  More impressively - it
can be the next systems language too (for some definition of "system").  And
it's *way* easier to use than C++.

In a world that knows about Java, C, Python, C++, this is one more step in the
right direction.  Excellent.

Second: so, WHY DOES IT SUCK?

To understand that, you need to understand the academics.  There's a lot of
research done on language design these days.  It's big (university) business
and it's making real advances.  Largely because it's found a "scientific" way
to approach the subject.

Imagine when physics was just starting to get off the ground.  Back when
people were discovering electricity, for example.  Everyone was like "hey,
there's this cool trick I can do with the skin from a dead cat and a glass
rod!"  "A dead cat?"  "Yeah!"  Which was fun.  But then maths got involved and
we began to see the underlying connections - the patterns, the regularities,
the way that light and electricity and magnetism and radio waves are all
interconnected.  A systematic, mathematical approach connected a wide range of
different ideas.  And we gained from that.  Radio, TV, microwave dinners.
Computers, even.

The same thing is happening with programming languages.  Mathematical ideas
unite different parts of languages.  When you spot those ideas you can join
together "different bits" to give something more uniform, and more powerful.

Go ignores all that.

You don't need to understand the maths to see this in action.  You can see
that things are not well thought out just by looking at the language.

Take functions.  Functions are cool.  Read an intro to functional programming.
First chapter, you're going to be recursing.  Sure, when you start, it's a
little confusing.  But then it "clicks" and you see how it can help you make
programs cleaner.  And it feels good.  Boy, does it feel good.

We've known this for years.  Functional programming languages have hammered
out all the tiny details.  It's easy to get this right.  It's easy to
implement.  Yet Go fucked it up.  Not in a big way, sure, but it's the details
that count.  It turns out that, in Go, you can't write a self-recursive
anonymous function - there's no way to make it refer to itself.  Well, you can
do a dirty hack that's the "official work-around" (I'll let you google for the
bug report).  But how could anyone, in this day and age, get that wrong?  What
were they thinking?
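
For reference, the work-around is to pre-declare a variable so the function literal has a name to recurse on (a sketch; `makeFib` is my own wrapper, there just so the trick is visible inside a function body):

```go
package main

import "fmt"

// makeFib builds a recursive function value.  A bare anonymous
// function has no way to name itself, so the work-around is: declare
// the variable first, then assign the literal, so the body can refer
// to "fib" by name.
func makeFib() func(int) int {
	var fib func(int) int
	fib = func(n int) int {
		if n < 2 {
			return n
		}
		return fib(n-1) + fib(n-2)
	}
	return fib
}

func main() {
	fmt.Println(makeFib()(10)) // 55
}
```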

It doesn't stop there.  You can't use recursion much anyway, because Go
doesn't guarantee efficient tail calls.  Which, again, is just dumb.  There
would have been no great implementation cost - so simple ignorance means that
you can't use Go to write code in a certain style.  I'm not saying that style
is always good, but I'd like the choice.  It's like being in a workshop that
only has hammers...
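
To illustrate, here's a tail-recursive accumulator and the loop Go pushes you towards instead (a toy example of my own; Go's growable stacks mean moderate depths survive anyway, but the recursive version still uses O(n) stack rather than the O(1) a guaranteed tail call would give):

```go
package main

import "fmt"

// sumRec is written in tail-recursive style: the recursive call is
// the last thing the function does.  Go does not promise to turn this
// into a jump, so the stack grows with n.
func sumRec(n, acc int) int {
	if n == 0 {
		return acc
	}
	return sumRec(n-1, acc+n) // a tail call, but not guaranteed O(1) stack
}

// sumLoop is the style Go forces instead: the same accumulator, as a loop.
func sumLoop(n int) int {
	acc := 0
	for ; n > 0; n-- {
		acc += n
	}
	return acc
}

func main() {
	fmt.Println(sumRec(1000, 0), sumLoop(1000)) // both 500500
}
```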

Another example.  The "academic" approach looks for patterns.  It tries to
reduce the language to a few simple, elegant ideas.  An important result of
this is that you, as a programmer, become as powerful as the language
designer.  What do I mean by that?  I mean that by exposing the "deep" ideas
there is less "magic" that only the language implementation can do.

A good example of this is continuations.  Continuations let *you*, the
programmer, implement exceptions.  Or functionality like Python's "yield".  So
someone can build a language that contains continuations (whatever they are -
the details don't matter here) and then "for free" the programmer can "extend
the language" with features that other languages have built in.

Now, continuations are, arguably, hard-core.  No-one is criticising Go for not
having continuations (well... now that I ask, why doesn't it?).  But that's an
example that illustrates the related "design smell": if the language is doing
things that you, as a programmer, can't do, then it's likely not a
well-designed language.

And Go does things that you can't do.  There's this little command called
"make" that does some weird magic.  It's like "new" but does extra stuff.  And
you can't extend it.  What?
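
For the curious, this is the magic in question: `make` initialises the run-time internals of exactly three built-in types, and nothing you define yourself gets the same treatment.

```go
package main

import "fmt"

func main() {
	// "make" works for slices, maps and channels -- and nothing else.
	s := make([]int, 0, 10)   // slice: length 0, capacity 10
	m := make(map[string]int) // map: a ready-to-use hash table
	c := make(chan int, 1)    // channel: buffer of one

	// A user-defined type gets no equivalent; the best you can do is
	// write an ordinary constructor function, by convention.
	fmt.Println(len(s), len(m), cap(c))
}
```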

Or: you can only use certain types as hash keys.  The same bad smell: some
types (known to the system) behave differently from the ones you, as a
programmer, make.  Why?
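
A sketch of the restriction (the `point` type is my own illustration): comparable types can be keys, but there is no user-definable hash or equality to make other types behave like the built-in ones.

```go
package main

import "fmt"

// point is comparable (a struct of comparable fields), so it can be
// a map key out of the box.
type point struct{ x, y int }

func main() {
	m := map[point]string{{1, 2}: "home"}
	fmt.Println(m[point{1, 2}])

	// A slice is not comparable and cannot be a key, and there is no
	// hook to supply your own hash or equality:
	//
	//   bad := map[[]int]string{} // compile error: invalid map key type
}
```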

By now you're starting to get suspicious.  So you start asking yourself: what
else have they ignored?  And you look at the type system.  And it's kind of
flaky.  A decent modern type system rules out whole classes of run-time
errors.  Not Go.  And the "comma OK" pattern?  Isn't that a broken version of
the Maybe type?  Oh, I guess Go doesn't use that because it doesn't have
pattern matching.  Well, why not?  The list goes on...
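
Here is the pattern in question (a toy example of my own): a second boolean result stands in for an option/Maybe type, and nothing forces the caller to check it.

```go
package main

import "fmt"

// lookup shows "comma OK": a (value, found) pair instead of a Maybe.
// Unlike pattern matching on a Maybe, the type system does not force
// the caller to check ok before using the value.
func lookup(m map[string]int, k string) (int, bool) {
	v, ok := m[k]
	return v, ok
}

func main() {
	ages := map[string]int{"ana": 3}
	if v, ok := lookup(ages, "ana"); ok {
		fmt.Println(v)
	}
	v, _ := lookup(ages, "bob") // compiles fine; v is silently 0
	fmt.Println(v)
}
```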

Now some of the above objections can probably be explained by efficiency.  And
that's a good excuse.  It's important to make a better C.  Believe me - I've
been programming in C for the last month.

There's another big reason: Style.  Go is a stylish language.  It's not
necessarily elegant, but it feels right.  Despite all the above, it's simple
and understandable.  You know you can use it to get the job done.

But wait!  Surely an academic, elegant language should be simpler still?
Those "deep" patterns should make things easier!  Yeah.  You'd think so,
wouldn't you?  But go look at Fucking Scala.  Yup.  Need to wash your eyes out
with bleach?  Thought so.  Fucking Scala (that's the official language name, I
believe) is the best argument I have that Go's creators know more than the
academics.

So is that it?  Is there really no better compromise?  Do we have to choose
between Go and Fucking Scala?  You would hope not.  But network effects are
more important than technical chops.  I can live with Go.  I couldn't live
with Fucking Scala.  Sure, I'd prefer Haskell, ML, or even a decently typed

All we can ask, then, is how to avoid this in the future.  Next time someone
with style and clout sits down to write a language, how do we make sure that
they're aware of the last forty years of language theory?

------------

One final idea I didn't manage to fit in above: duck typing relies on
conventions.  To exploit the language to the full you need to know these.  Now
they're documented in API docs, of course, but I get the feeling that there
should be something more.  To program well with duck types you need, more than
in other languages, to "get" the "culture" of the language.  I feel like Go
should have added something here, but I don't know what (it has quite a rich
set of tools for an independent language).  Maybe I'm talking rubbish.  It's
just something I wanted to get out there....

And here's a thread that gives some pointers to earlier work that might have
been relevant:
http://research.swtch.com/2009/12/go-data-structures-interfaces.html

Andrew

### Re: Micro Languages

From: andrew cooke <andrew@...>

Date: Sun, 10 Jul 2011 09:59:36 -0400

That's a really interesting viewpoint.  You may well be right.  I wonder if
the same argument applies to type systems?  Should they also be monolithic?

I need to think some more, but thank-you very much for the comment.

Andrew

### Some Links, Clarifications and Corrections

From: andrew cooke <andrew@...>

Date: Sun, 10 Jul 2011 15:07:39 -0400

This post was surprisingly popular, yet many people seemed to miss my point.

To see the discussion, check out
http://news.ycombinator.com/item?id=2746698

When I wrote the post I hoped to do two things:
- Explain to people who like Go why it's been criticised and "looked down on"
by some people.
- Explain to people who have criticised Go why it's such a good, useful
language.
And, in my wildest dreams, as a result of that, help somehow fuse the "best
bits" of both.

Of course, my starting points may be wrong.  I don't think I have
misunderstood why Go is so good, because I am a programmer that has used C
quite a bit; it's more likely that I have either mis-characterised or
mis-understood the criticism of the language.

So I was most worried (because I would look dumb; on the other hand I would at
least learn something) that people would explain why Go's design *had* to be
that way (for example: perhaps the "duck typing" approach is so radical that
there is simply no research in that area).

And there were some comments in this direction.  Thanks for those - I will
check out the various points made.

But mostly, it seems, my audience was more on the "Go is good" side.  And
there I seem to have failed miserably in explaining why not everyone feels the
same.  I don't think I saw one comment that suggested that I had helped
someone see "the other side".  Instead there were dismissals,
mis-characterisations, and arguments aimed at my specific examples rather than
the more general points I was trying to make.

Anyway, I need to get back to coding.  I am trying to use Go's impressive
support for image generation and have just spotted a cool way to use the type
system (I think I can extend the image package from "outside" to use HSV
values as well as RGB).

Cheers,
Andrew

### Re: Go Rocks - How Can We Avoid Something This Bad In The Future?

From: Jonathan Wright <quaggy@...>

Date: Sun, 10 Jul 2011 20:22:30 +1200

Hi Andrew,

Is the following a fair summary of your views?

Thesis: "Next time someone with style and clout sits down to write a
language, how do we make sure that they're aware of the last forty
years of language theory?"

Examples of how Go ignored language theory:
- have to use work around to create a self-recursive anonymous function
- no tail call optimisation
- "Continuations let *you*, the programmer, implement exceptions."
(and coroutines and so on)
- "'Extend the language' with features that other languages have built in."
- Go has builtins that you can't replicate
- Go has primitive types that are special

My take is that, rather than Go ignoring years of language research,
the complaints stem from Go being a monolithic language rather
than a micro language. I've stolen the monolithic/micro distinction
from the kernel space, but the idea seems to apply here too.

Micro languages are wonderful. They're the near-minimum set of
features required to build all other features. Ideally they are massively
flexible and allow changing:

- semantics
- syntax (DSLs, addition of new features, ...)
- flow control (if, for, while, coroutines, exceptions, resumable
exceptions, ...)
- data storage
- ...

Nothing is really special. Everything can be replaced or changed at
compile or runtime. It is up to the library and application
programmers to create the language they need for the task at hand.

Micro languages have a long history: Forth, Lisp, Scheme and friends.
I personally was attracted to the Io language and spent a number of
years coding in and on Io. The sky is the limit. They can be a lot of
fun to craft in.

In my experience micro languages have a flaw. They tend to attract
people who love, embrace and thrive with complexity. Almost anything
can change, and does. Idioms are critical, but it is hard to really
know how anything will behave from inspecting one fragment of the
code. The programs end up with millions of concepts rather than a
small set.

As in the kernel space, the monolithic tends to come out on top.
Monolithic languages have rigid semantics and syntax. The behavior is
obvious; what you see is what it does. There are simple semantics that
hold everywhere. The monolithic languages are not flexible. They have
a particular model and set of operations and that's that. If you want
to play with an interesting new style, you've either got to change the
compiler or use a new language.

To go back to the complaints, in the micro vs monolithic light they
boil down to:

1. no tail calls
2. no continuations
3. privileged builtin types and functions

To me, tail calls are a design decision for each language. The great
Python tail-call debate of 2009 nicely showed that with tail calls
certain things are easy, and without tail calls other things are easy.
Tail calls have a significant influence on other features in the
language around function calls, scoping and error handling. There does
not appear to be a clear consensus that the trade off should be one
way or the other. For now it seems prudent to explore both sides until
a clear winner emerges.

As for continuations and privileged builtins these are classic areas
of conflict between micro and monolithic languages. With continuations
all manner of wonderful flow control constructs can be implemented,
from if statements, for/while loops up to coroutines and retryable
exceptions. Very powerful. Rather confusing. If you want a tiny
language that can be extended indefinitely, then continuations are a
must have. If you want a language to sit down and use, with everything
being simple and obvious, then continuations mask actual behavior.

In a micro language, there will be few, if any, builtins and special
types. Anything can be changed except for a few atoms the entire world
is built on. In a monolithic language, there tend to be builtin types
or functions to provide stability and certainty. In micro languages
the builtins may well be hidden below libraries and low level, where
in monolithic languages, the builtins are higher level and visible.

Given the history, it seems unlikely that a mainstream language will
be a micro language.

Convinced? Not even a wee bit? :-)

Thanks,
Jonathan.

### Re: Goroutines

From: John Beshir <john@...>

Date: Mon, 11 Jul 2011 13:25:23 +0100

Just a small correction; goroutines are not "real threads", nor are
they traditional "coroutines", but they're much closer to the latter.
They are basically coroutines multiplexed across an arbitrary number of
OS threads, with scheduling points at memory allocation, channel
send/receive, and system calls.

They are MUCH cheaper than regular threads, their primary cost being
4KB of RAM for the initial segmented stack, and you can easily spawn
hundreds of thousands on a single system, more if you have the RAM for
it.

Extra real threads are created as necessary when they make blocking
system calls, so they behave like real threads, most of the time.
Network I/O uses a single thread with non-blocking I/O, which wakes up
goroutines when their operation is done, all of which is handled for
the programmer, who has a simple blocking I/O interface provided. This
means a goroutine per connection makes network code both fast and pretty
to write.

This is nothing other languages haven't done in some form, but is
significantly cooler than your description suggested, letting you
write concurrent code easily and with good performance.

(I should note that gccgo currently has a crippled implementation of
goroutines which requires an OS thread per goroutine, but this is not
the general or intended case.)

This post makes a similar point, in a way that might appeal more to people
Andrew