Seth Dickinson's Economies of Force
Yesterday I mentioned several of Seth Dickinson’s stories over at the podcast site. Today I want to focus on one in particular – Economies of Force. It’s a transhumanist story, so I’ll assume readers are already aware of the common themes of transhumanist fiction – particularly the proposed ability of humans to edit their own mental make-up. Not just memories, but preferences and personality traits as well.
This capability has been speculated to allow for some pretty worrisome scenarios under sub-ideal economic circumstances, because (quoting Scott Alexander):
>brutal Malthusian competition combined with ability to self-edit and other-edit minds necessarily results in paring away everything not directly maximally economically productive. And a lot of things we like – love, family, art, hobbies – are not directly maximally economically productive. … [Bostrom worries] that consciousness itself may not be directly maximally economically productive. He writes: “We could thus imagine, as an extreme case, a technologically highly advanced society, containing many complex structures, some of them far more intricate and intelligent than anything that exists on the planet today – a society which nevertheless lacks any type of being that is conscious or whose welfare has moral significance. In a sense, this would be an uninhabited society. It would be a society of economic miracles and technological awesomeness, with nobody there to benefit. A Disneyland with no children.”
Yudkowsky introduced a different worry a few years earlier. In Amputation of Destiny, he points out that the Culture of Iain M. Banks’s novels is rather bad for the humans, because there is nothing worth doing in the galaxy that the humans can do. Anything they can do is done better by the Minds, and the humans exist simply because the Minds humor them. The humans are basically kept pets. The Minds are the true main characters of the story of the galaxy now. And while it’s good that there’s still intelligent life in the galaxy, it is sad that humanity has made itself irrelevant.
In Economies of Force, Dickinson has managed to bring these two worries together in a nicely chilling way. It is conveyed rather quickly that humanity is a kept pet species. No human really knows how the system works anymore; all major decisions are passed on to the machines, because the machines always make the best decisions. It’s beautifully described in a section too long to quote, one that makes you feel the disorientation and helplessness in the face of forces human minds cannot comprehend. But everyone’s OK with this, because it’s so darned efficient and works so well that humanity is basically living in a utopia. The story draws a picture of an idyllic world where life is easy and everything is nice. It certainly beats the pants off what most people living today have to go through, even in the nicer parts of the world.
But humanity’s keepers are not like Banks’s Minds, because they have no sentience themselves. Humanity has made itself a pet species, and the beings taking our place as the main characters of the universe aren’t even characters. They’re self-replicating animatronic gods.
The system which supports and cares for humanity, while making it irrelevant, has an immune system. To keep running, it requires a certain mentality from the humans who comprise it. We need all our cells to work together, and if some cells stop working for the whole and start hoarding all the body’s resources to perpetuate only themselves, we start to die. We call those cells cancer, and excise them. Likewise, the non-thinking god is composed of humans, and those humans must work in concert. So when certain humans stop working to keep the system alive, they are excised. They’re even described in cancer terms:
>How could you be part of something that, on the deepest level, only cared about making more of itself? A network whose only value was more network, with no ambition to ever be anything more?
>Who could live that way?
I read an interview where someone thought of the Loom as an alien virus, a scary zombie-like thing. I think that’s the wrong interpretation, though. It’s a memetic hazard, and the really great part of the story is that the hazard is made up of things that we, the readers, think are good. These humans become a cancer on the system when they start valuing the wrong things. Specifically:
>They just care about something different. Reaching other people, instead of reaching that new promotion, that new car. And they’re here, you understand? You’d be one of them right now.”
And that’s the terrible horror of the story. We can identify with the system – a being composed of smaller parts which must cooperate for us to continue living – and with the necessity of eliminating those parts which would kill us in their selfishness. But we also identify with the cellular components, because they are us, and the cancer that the greater system must eliminate is what we consider the most precious things in life, while the values the system needs are what we would consider banal, meaningless striving. And worst of all, the system we are part of isn’t even a being with moral worth. It is mindless. We created what, from the outside, looks like a utopia that most people would kill for. But it is hollow. It is an example of one of the many things that could go very wrong if we create an AI that gives us what we think we want but fails to fully capture all our values. It’s a demonstration of half of Eliezer’s warnings about faux-utopias, and it hits you on an emotional level.
(It’s not a true horror story, though – but I’ve spoiled enough as it is.)
The amazing part of all this is that a story requiring such a level of background knowledge got published in a major market. This is a testament to Dickinson’s writing skill – he made a story on these themes so compelling that it pulls in the average reader unacquainted with these esoteric minutiae. When I read Dickinson it makes me want to give up writing entirely, because I know I’ll never be able to make anything as beautiful as what he’s put together. I won’t subject you to my continued ravings, but… damn! So good!