Extinction is the rule. Survival is the exception.
Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.
(Tweet by Elon Musk, CEO of Tesla)
The ultimate risk
Several months ago, The Atlantic published an interview with Nick Bostrom, the director of Oxford's Future of Humanity Institute. Bostrom's contention is that human extinction risks are not well understood. (The complete article, We're Underestimating the Risk of Human Extinction, is linked below.)
While Bostrom discusses present threats to humanity, he is clearly more concerned with longer-term existential threats, which he believes deserve our focus. At the top of his list is artificial intelligence; in the nearer term, the danger lies in what we might do in biotechnology and synthetic biology.
It is also Bostrom's contention that one of the very worst things humanity could do is slow down or halt technological evolution, because that in and of itself would “constitute an existential risk.”
In short, according to Bostrom, our “avoidable” misery will only get worse if we fail to improve the quality of life, and for him technology is the key, or at least a central priority we cannot afford to diminish. Bostrom goes on to say, however, that it would be difficult to slow technology very much, let alone halt it, given the constituencies pushing scientific and technological agendas, along with economic interests and assorted individual and institutional priorities. Well, possibly, yet....
Considering the finite then and now
In his wonderful book How the Irish Saved Civilization, Thomas Cahill recounts a winter day in 406 A.D. Roman soldiers stood on one side of the frozen river Rhine; on the other side were the barbari, thousands upon thousands of hungry barbarians from assorted Germanic tribes, determined to cross over to where the Roman legions stood as defenders of the “civilized” world.
Virtually no one on the Roman side of the Rhine, where soldiers guarded the culture and glory of antiquity and commanded the most advanced technology in the world, even at this late date, could have imagined that the Eternal City of Rome and its legacy would crumble and vanish. It would have been incomprehensible.
In his book Civilization, the late Kenneth Clark asks why Greek and Roman civilization collapsed. His answer: it was exhausted. Antiquity had run out of steam. It had become, as Cahill points out, a static world.
Doing the expected had become the highest value. Fear of war, fear of invasion, fear of plague—fear of everything was the norm. Why bother to do anything? Late antiquity had become a world of empty rituals, obscure religions and the wholesale disappearance of self-confidence.
The barbarians? The disappearance of self-confidence? The Islamic State currently ravaging parts of Iraq and Syria, a determined collection of murderers and psychopaths longing for the Middle Ages, would have been a familiar sight to citizens of the Roman Empire in the second decade of the fifth century.
The latest discoveries regarding our cousins the Neandertals, no longer thought to be ignorant brutes, suggest that their interactions with modern humans may have lasted considerably longer than once thought. There is also speculation that the Neandertals went extinct because, unlike Homo sapiens, they could not adapt quickly enough to climate change. But of course we cannot draw exact comparisons, and we do not know what the future holds.
Will technology be our salvation, as some believe, or is something else required? Will the elephant and the rhino survive the slaughter driven by Chinese demand for ivory and horn? Will humankind somehow stop doing the expected and rediscover a new self-confidence?