Before I speak about the book, I have to warn everybody that it has a 3-page preface by Steve Bannon. If that’s too traumatic, then don’t read it. For the alternatively situated, I have to mention that Allen despises Elon Musk (although not for the same reason the Left does) and has no interest in Trump (also for different reasons than the Left). If that traumatizes you, don’t read it either.
Now that the sensitive snowflakes are gone, I can tell everybody else that Dark Æon is a messy, ranting, raving, extremely erudite and highly entertaining gem of a book. People in Amazon reviews complain that it’s hard to understand and, yes, AI is not an easy subject. Allen does everything possible to make the reading go down easier, but you still need a developed human intellect to understand artificial intelligence.
The main idea is that our techie elites are driven by a two-pronged fear. They are terrified of death because they have an overinflated sense of their own importance. And they are also convinced that AI will soon become so powerful that it will end humanity in its present form. Nobody really knows what AI will ultimately become, but the fear is already here, especially among AI’s creators and investors. As a result, they have become convinced that the only way to address these two fears is to strive for a technologically augmented humanity. At least, for the humans who matter. Those who don’t matter (like you and me) can be used as a pool of experimental subjects. Those who don’t agree can be shut away into basic-incomed but otherwise neglected enclaves of useless people. Or humans 1.0.
This plan will ultimately fail because God is great (Allen is very religious), but it will do a lot of damage in the meantime. Allen outlines a detailed plan for what each of us can do to preserve our own normal human subjectivity and says that everybody is personally responsible for maintaining the lines of cultural transmission.
I’ll say more later but let’s sit with this part for the moment.