Pandora’s Box and Collapse

Monday March 1, 2021 — Taipei, Taiwan

A pretty standard way to think about technology is as a sort of Pandora’s Box: once you’ve invented something, you can’t un-invent it, so you have to deal with whatever consequences you’ve created. Maybe you stop using a technology once you see it’s causing harm, but to have any real effect you also need to convince everyone else to stop using it, which is extremely hard: if a piece of technology gives its user power, unscrupulous people are sure to use it regardless of the externalities. It’s only by collectively banning the technology and punishing its use that we can actually stop the harm.

That’s why it was interesting to come across a different model: Ran Prieur talks a lot about “collapse,” recently writing:

February 25. Taking another angle from Monday, “The internet as we know it is doomed.” It’s by Annalee Newitz, who wrote that new book about ancient cities. Her argument has two parts. First, that there were two waves of ancient cities, and the first wave failed because it didn’t have the right institutions to manage population density, so people got unhappy and left.

Then she argues that the internet is the same way. It’s getting bigger and clunkier, and the costs are beginning to outweigh the benefits, so that people are now trying to live without it. Maybe the internet will fade away, and eventually “return in a form we can only guess at.”

Chris, who sent the link, comments:

Every time we add extra complexity to our world, there is a decrease in the power of any single person to comprehend the society and technological foundations thereof. It feels psychically unsustainable. The state asks citizens to manage a baseline amount of technical overhead to have a modern life, but no one ever stopped to ask how much overhead it ought to take for our world to be mediated by the internet.

I think this is a big factor in the anxiety epidemic. I’ve said this before: the prophet of our age is not Orwell or Huxley, but Kafka. Password requirements have become so labyrinthine that I can’t possibly remember them all, and I don’t trust my computer to keep track of them, because I’ve seen both software and hardware unexpectedly fail. So I keep them all written down on a piece of paper, and if I ever lose it, I might as well go live under a bridge.

In a high-complexity society, I live in the shadow of dread of all the things that could go wrong, that I would be responsible for fixing and have no idea how to fix. The thought of total technological collapse is comforting, because we would all be in the same boat, and our troubles would be comprehensible.

This reminds me of CGP Grey, responding to the question “Do you think the next ten years will be better or worse for humanity, and more or less impactful on human history than the previous ten?”:

Thinking about impact over the long run of human history, it’s pretty clear that impact tracks with tech and tech with time. Ten thousand years ago, there was very little you could do to impact the world. So I do expect the next ten years will be more impactful than the last. And I will bet they’re better than the last. I’m less confident about that second part than I used to be, but I still think it holds that, on average, each decade is an improvement. That is, until we accidentally stumble into an existential threat. Which is game over. Which is pretty… pretty bad.

It seems pretty clear that the Pandora’s Box model is inescapable, right up until we invent a technology that makes collapse inevitable. We’ve done a pretty good job of that with nuclear weapons (which don’t get as much focus as they should in x-risk circles, IMO: climate change is far more certain to kill us, but slowly, whereas with nuclear weapons a single mistake made today could easily catalyze a global collapse; and no, “AI” is not the thing that will catalyze collapse), but I don’t see any reason why a more potent collapse-causing technology couldn’t be, or hasn’t already been, invented.

It might also be possible to bootstrap our way out of this — if we build a society that cares about tail risks and is willing to take collective action to intervene in the systems likely to cause collapse, it doesn’t seem implausible for civilization to have an almost indefinitely long run. I’m just not very hopeful of that, given what I’m seeing right now.
