MacAskill: AI Will Compress a Century of Progress Into a Decade, While Human Institutions Remain Fixed

"We're thinking about 100 years of progress happening in less than 10," warns philosopher Will MacAskill, describing a near-future scenario where artificial intelligence triggers a "technological acceleration" so profound it compresses a century's development into a single decade—while human institutions remain fixed at their biological pace.
Humanity faces an unprecedented "temporal asymmetry" between technological advancement and institutional adaptation, writes End of Miles, as the Oxford philosopher outlines a framework for understanding the dangerous decision-making environment this would create.
The Cuban Missile Crisis in 24 Hours
To illustrate this asymmetry, MacAskill offers a vivid historical counterfactual: if the technological developments between 1925 and 2025 had occurred in just ten years, the Manhattan Project and the Hiroshima bombing would have been separated by only three months, while the Cuban Missile Crisis, which historically spanned 13 days, would have lasted barely more than 24 hours.
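The arithmetic behind these counterfactuals is a simple division by the compression factor. A minimal sketch (the `compress_days` helper and the sample durations are illustrative, not MacAskill's own calculations):

```python
# Scale historical durations by the 100-years-in-10 compression factor.

COMPRESSION_FACTOR = 10  # a century of progress squeezed into a decade

def compress_days(duration_days: float, factor: int = COMPRESSION_FACTOR) -> float:
    """Return how long an event would last under temporal compression."""
    return duration_days / factor

# Illustrative events (durations approximate).
events = {
    "Cuban Missile Crisis (13 days)": 13,
    "Manhattan Project to Hiroshima (~3 years)": 3 * 365,
}

for name, days in events.items():
    print(f"{name}: ~{compress_days(days):.1f} compressed days")
```

Thirteen days of crisis deliberation become about 1.3 days, and roughly three years of wartime weapons development shrink to about 110 days, matching the figures MacAskill cites.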
"The Cuban Missile Crisis lasts a little over a day. There's a close nuclear call every single year. This clearly would pose an enormous challenge to institutions and human decision making."Will MacAskill
MacAskill cites Robert F. Kennedy's assessment that if decision-makers had faced a compressed timeline during the Cuban Missile Crisis, "they probably would have taken much more aggressive, much riskier actions than they in fact did."
The differential acceleration problem
The "temporal compression" MacAskill describes doesn't affect all domains equally. Areas requiring physical experimentation, regulatory oversight, or capital-intensive infrastructure will advance more slowly than purely intellectual pursuits like mathematics or theoretical computer science.
"Human reasoning, human decision making, and human institutions don't speed up to match the pace of technological development. In fact, a different way you could think of the thought experiment is imagine if the last 100 years had happened in terms of tech development, but humans just thought 10 times slower." Will MacAskill
This creates a fundamental "acceleration asymmetry" where new destructive capabilities, transformative technologies, and scientific breakthroughs will emerge before social systems can adapt—potentially creating catastrophic disconnects between technological power and institutional wisdom.
Beyond acceleration to transformation
According to MacAskill, the magnitude of change could exceed even this extreme century-in-a-decade projection, potentially reaching "many centuries, or even 1,000 years in a decade." This would create situations analogous to "a medieval king who is now trying to upgrade from bows and arrows to atomic weapons in order to deal with this wholly novel ideological threat from this country he's not even heard of before, while still grappling with the fact that his god doesn't exist and he descended from monkeys."
The "adaptive gap" between technological capability and human institutional response becomes the central vulnerability in MacAskill's framework—particularly as he believes we may be merely 3-7 years from the onset of these acceleration dynamics.
What can be done?
MacAskill advocates for immediate preparation through development of AI-augmented advisory systems for decision-makers, alongside institutions specifically designed for rapid adaptation. He emphasizes the potential for AI systems themselves to help navigate the challenges they create—if deployed thoughtfully and with sufficient foresight.
"The world would be acting very differently if it was taking seriously the idea of a potential, even just a significant chance of an intelligence explosion in the near term. And honestly, I'd feel just a lot better about the world if that were the case." Will MacAskill