This note is the fourteenth letter in the 104-days-of-summer-vacation series. You can also follow the full Twitter thread here, and leave any thoughts and comments that might come up!
The book Superintelligence by Nick Bostrom has been on my reading list for a while now.
Superintelligence is a book on artificial intelligence safety, and touches on a range of topics from existential risk to the ethics of human enhancement. Nick is primarily concerned with the implications of co-existing with superintelligent beings, especially given that intelligence seems to be the primary factor in determining the agency of a species. After all, the future of orangutans is determined more by us than by orangutans themselves.
I first came across Superintelligence while I was browsing the rationalist forum, LessWrong. Rationalism is another thing that I’ve been meaning to read more about; I’m fascinated by the core premise of pursuing more accurate beliefs (be-radically-skeptical).
The road to wisdom? Well, it’s plain
and simple to express:
Err
and err
and err again
but less
and less
and less. – Piet Hein
I started reading Superintelligence over my trip, and it reminded me of one thing that annoys me about the rationalism community: the way it writes in unnecessarily esoteric language, and Superintelligence is no exception. I’ll give an example from the book:
But even modulo a solution to how to create superintelligent AI, there remain the difficulties that arise specifically from the value-loading problem.
A person familiar with computing terminology will correctly infer that what Nick means is “Even if there is a solution to superintelligent AI, …”, which would be a perfectly good way to phrase this sentence without any change to the meaning.
The use of modulo, in this case, seems to serve only the purpose of in-group signaling, by using the language (memes) of a community. This is not the only such example in the book.
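For readers who haven’t met the term: in mathematics and programming, modulo is simply the remainder operation, and the jargon sense of “modulo X” (“setting X aside”) borrows from it. A minimal illustration in Python:

```python
# "a modulo n" is the remainder after dividing a by n.
print(17 % 5)  # 17 = 3 * 5 + 2, so the remainder is 2
print(10 % 2)  # 10 divides evenly by 2, so the remainder is 0

# The jargon sense borrows the idea: "modulo X" means "ignoring X",
# the way a remainder ignores whole multiples of the divisor.
```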
This is not necessarily a bad thing, just an annoying one. Rationalism itself is a decentralized brand, and esoteric language helps create a strong sense of community. Geeks, MOPs, and sociopaths in subculture evolution explains that a strongly exclusive signal (through language or otherwise) is a useful feature for maintaining the integrity and spirit of a subculture.
I think what irks me is that the rationalism subculture is all about encouraging the pursuit of truer belief, yet its esoteric language gates many people out of that goal. Most people don’t know what modulo means, and the people who do fit a specific template. Isn’t the goal to make rationalism approachable to as many people as possible?
This drawback gives the whole thing a somewhat elitist flavor, when I’m sure that’s not the spirit of rationalism at all. The pursuit of truth should be a fundamental human right, and in that spirit, the language used should be inclusive to people from all domains. By this, I don’t mean avoid formalism; formalism should be used where appropriate to convey an idea.
But rather, say things as simply as possible. The Economist Style Guide echoes this:
The first requirement of The Economist is that it should be readily understandable. Clarity of writing usually follows clarity of thought. So think what you want to say, then say it as simply as possible.
That’s the sort of writing I aspire to: writing that expresses complex ideas in simple ways without diluting meaning. And perhaps that’s something more people should try to do (consultants, wink wink).