The End of the World: The Lottery of Mutual Self-Destruction!

The illustration above shows a bag of creativity containing thousands of balls. Each ball represents an as-yet-unknown technology that advances or influences humanity in a major way. A few of the balls already pulled from the bag might include the petrol engine, the internet, flight and nuclear power. Amongst these technological balls is a black ball: a technology that all but guarantees the deaths of a vast number of people, or even the complete destruction of humanity. This black ball is the final ball.

We, as humanity, continually pick balls out of the bag. We strive to create and develop new technology without much consideration of its potential negative effects on the world. It's only by chance that we haven't yet picked out the black ball and blown ourselves up.
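To see why luck so far is little comfort, here is a minimal sketch of the urn metaphor in Python. The per-draw risks and draw counts below are purely illustrative assumptions, not figures from Bostrom's paper; the point is simply that even a tiny chance per technology compounds across many draws.

# A minimal sketch of the urn metaphor, with purely illustrative numbers:
# suppose each new technology we "draw" carries some small, unknown chance
# of being the black ball. Even a tiny per-draw risk compounds over many draws.

def cumulative_risk(per_draw_risk: float, draws: int) -> float:
    """Chance of having drawn the black ball at least once after `draws` draws."""
    return 1 - (1 - per_draw_risk) ** draws

if __name__ == "__main__":
    for p in (0.001, 0.01):        # assumed risk per technology (illustrative)
        for n in (10, 100, 1000):  # number of technologies drawn (illustrative)
            print(f"risk per draw {p:.3f}, {n:4d} draws -> "
                  f"{cumulative_risk(p, n):.1%} chance of the black ball")

With an assumed 1% risk per draw, the chance of hitting the black ball passes 60% after only a hundred draws; at a thousand draws it is all but certain.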

We have come very close, however. A good (or rather bad) example of a potential black ball from the past is the development of nuclear weapons: the drive to split the atom, to release and harness that massive energy and then, through the Manhattan Project, to develop weapons. This step-by-step drive by scientists to build a bomb went largely unchecked, and we soon found ourselves on the precipice of mutually assured destruction.

The checks and balances to police, restrict and protect the population of the world were left to chance. The proliferation of nuclear weapons was self-limiting only because the technology and its construction were specialised, expensive and time-consuming, so only a few countries could pursue them. Just imagine if anyone could make a nuclear weapon with a bag of sand and a microwave.

So, if there were an easy, unfettered way to access the technology to detonate an atomic bomb, we could be fairly certain that it would have been used, even if only a small number of individuals wished it. There are significant numbers of people with radical religious or ideological doctrines, mental illness, or a hatred of the culture in which they live, as well as insular, oppressive dictatorships, their opponents, and those who seek to extort and threaten. There are many reasons, unfortunately, why killing a large number of people seems like a good idea to an outspoken minority of the global population.

Today, nuclear weapons are not an immediate black ball. It was close, but thank goodness for chance. We are still picking balls from the bag, so who knows how technologies like artificial intelligence, digital DNA printing and synthesis, and biotechnology, or the eventual effects of global warming and climate change, will influence and affect us all.

The question is not so much whether we will stumble onto a technological black ball that wipes us all out, but rather that we have nothing in place to reduce the risk, or to stop it if we do. How do we stop developing things that will kill us, like the nuclear weapons that were a mere hovering finger away from destroying the planet? The measures could involve halting or restricting the development of such technology, or somehow ensuring that there are no bad people. We could also have effective policing and monitoring of individuals who could cause harm, with intervention when action is required, in a somewhat dystopian, totalitarian future under an effective global governance.

I’m sure we’ll keep pulling out those technological balls with our current insatiable drive. Let’s hope we don’t find a black ball.


Listen to an outline of the ‘Vulnerable World Hypothesis’ by Nick Bostrom on the Sam Harris ‘Making Sense’ podcast. The section on this starts at 28:30 if you want to go straight to it.

Read the ‘Vulnerable World Hypothesis’ by Nick Bostrom. This opens a PDF.

Nick Bostrom’s website also contains other interesting articles.
