Eric Schmidt: AI Systems Improving 10x Annually, Pace Could Accelerate Further

Illustration: visualization of AI scaling laws and exponential capability growth.

"The improvement of these now essentially deep learning algorithms is going up at about a factor of 10 a year," former Google CEO Eric Schmidt revealed during a recent tech forum, characterizing the rate as "mindboggling" and warning that this already staggering pace could soon accelerate dramatically.

End of Miles reports that Schmidt's comments came during a wide-ranging discussion at the historic PARC Forum, where he painted a picture of AI advancement that could soon outstrip human capacity to manage or match its growth trajectory.

AI scientists could supercharge already exponential growth

The tech industry veteran and co-author of "The Age of AI" outlined how AI's current growth rate could soon enter an entirely new phase of acceleration through what he described as AI scientists – essentially AI systems capable of conducting AI research themselves.

"The industry believes that in the next little while we're going to get to the point where we're going to not just have human scientists but we're going to have AI scientists. The industry believes that you'll have like a thousand people at OpenAI and you'll have 100,000 AI scientists."Eric Schmidt, PARC Forum, March 2025

Schmidt's assertion that these AI scientists could match or exceed human capabilities has profound implications for the speed of innovation. If those systems sustain the current factor-of-10 annual improvement rate, adding potentially "a million AI scientists" to the development ecosystem would steepen the innovation curve dramatically.
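To see why the compounding matters, here is a minimal sketch in Python using the article's factor-of-10 figure; the 30x rate used for the AI-scientist scenario is a hypothetical for illustration, not a number Schmidt gave:

```python
# Toy model of compounding AI capability growth (illustrative only).
# The factor-of-10 baseline is Schmidt's figure; the boosted rate below
# is a hypothetical assumption, not from the talk.

def capability_after(years: float, annual_factor: float = 10.0) -> float:
    """Capability multiplier after `years` at a fixed annual growth factor."""
    return annual_factor ** years

# Baseline: factor-of-10 yearly improvement compounds fast.
print(capability_after(1))  # 10.0    -> one year of progress
print(capability_after(3))  # 1000.0  -> three years compound to 1000x

# Hypothetical: if a vast pool of AI scientists lifted the rate to 30x/year,
# the same three years would compound to 27,000x instead of 1,000x.
print(30.0 ** 3)            # 27000.0
```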

The three scaling laws driving AI's explosive growth

The former Google executive offered a technical explanation for this accelerating progress, citing research on what he called "scaling laws" that govern AI advancement. Schmidt referenced a paper by Anthropic CEO Dario Amodei that identifies three distinct scaling dynamics.

"There are really three scaling laws going on. The first scaling law is the one that you know about, deep learning. Deep learning is what ChatGPT works on, and that scaling law says that as you add more hardware and data, you get more emergent behavior. And we haven't found the limit – all that everyone says there is a limit, but we haven't found it yet." Eric Schmidt

The second and third scaling laws, according to Schmidt, involve reinforcement learning and "test-time compute" – systems that refine their responses even as they deliver them. He emphasized that these latter two scaling dynamics are "just at the beginning," suggesting even greater improvement rates ahead.
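Schmidt did not state the underlying math, but the scaling-law literature (e.g., Kaplan et al., 2020) typically models the first dynamic as a power law relating loss to compute. The sketch below uses that standard form with purely illustrative constants:

```python
# Hedged sketch of the power-law form from the scaling-law literature,
# not a formula Schmidt stated: loss falls as a power of compute,
# L(C) = (C_crit / C) ** alpha. Constants are purely illustrative.

def loss(compute: float, c_crit: float = 1e8, alpha: float = 0.05) -> float:
    """Predicted loss for a given compute budget under a power law."""
    return (c_crit / compute) ** alpha

for c in (1e8, 1e9, 1e10, 1e11):
    print(f"compute={c:.0e}  loss={loss(c):.3f}")
# Each 10x in compute multiplies loss by 10**-alpha (~0.89 here):
# steady, predictable gains with no hard ceiling yet observed.
```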

Geopolitical implications of exponential AI advancement

Beyond the technical aspects, Schmidt highlighted how this dramatic improvement rate creates profound implications for international competition. In a scenario where one country achieves a seemingly modest six-month lead in AI capabilities over another, the exponential growth rate would make that gap virtually impossible to close.

"In network effect businesses, when the slope of growth is this steep, you never catch up." Schmidt

This dynamic creates significant national security concerns, according to the tech leader, who recently co-authored an article on superintelligence with Dan Hendrycks. He warned that as AI reaches the point where it can create "a thousand Einsteins and a thousand Leonardo da Vincis," the result could be "inherently destabilizing to world order."

Hardware demands to support growth

The computational resources needed to sustain this 10x annual improvement trajectory are immense. Schmidt described industry discussions around AI data centers requiring 1-10 gigawatts of power – equivalent to "two nuclear plants in one data center."

Each gigawatt of computing power, he noted, represents approximately $45 billion in hardware investment, putting a potential 10-gigawatt facility at nearly half a trillion dollars in computing infrastructure.
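Those figures are simple multiplication on Schmidt's $45 billion-per-gigawatt estimate; a quick back-of-envelope check:

```python
# Back-of-envelope check of the hardware figures cited in the article:
# roughly $45B of hardware per gigawatt of data-center capacity.

COST_PER_GW_USD = 45e9  # Schmidt's cited figure

for gw in (1, 2, 10):
    cost = gw * COST_PER_GW_USD
    print(f"{gw:>2} GW data center -> ~${cost / 1e9:,.0f}B in hardware")
# 10 GW -> ~$450B, i.e. "nearly half a trillion dollars"
```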

For this reason, Schmidt urged "the hardware folks in the room" to "build us more hardware," emphasizing the massive energy demands of this growth trajectory. Despite ongoing work on more efficient algorithms, the former Google chief expressed skepticism that efficiency gains would outpace rising usage demands.
