From Cloud to Pocket: How AI Processing Is Moving Away From Data Centres

For more than a decade, the global digital economy has been built around a simple assumption: bigger data centres mean better technology. Vast warehouses packed with servers have powered everything from video streaming and online banking to artificial intelligence. But that model is now facing its most serious challenge yet.

The question gaining traction across the tech world is no longer whether data centres will grow — but whether they will remain the centre of AI at all.

A Challenge from the Pocket

The debate was recently reignited by Aravind Srinivas, chief executive of AI company Perplexity, who argued that the dominance of large data centres could one day be undercut by something far smaller: the smartphone.

Speaking on a podcast, Srinivas suggested that as artificial intelligence becomes more efficient and personalised, powerful AI tools could run directly on consumer devices. Instead of constantly sending data back and forth to distant servers, processing could happen locally — on phones, laptops, routers, or even set-top boxes.

This would represent a fundamental shift away from the centralised computing model that currently defines AI.

Early Signs of a Shift

The idea is no longer theoretical. Apple’s latest devices already process some AI tasks directly on-device through Apple Intelligence, using specialised chips to improve speed and privacy. Microsoft has followed a similar path with Copilot+ laptops, which include built-in AI processing capabilities.

Yet these remain premium products. Most consumer hardware still lacks the power needed to run advanced AI locally, meaning large data centres remain essential — for now.

The Scale of the Data Centre Machine

Today’s data centres are enormous operations. Often spanning an area the size of several football fields, they house thousands of servers handling everything from cloud storage to AI training. Almost every online service relies on them in some way.

Tech giants continue to double down on this infrastructure. Billions of dollars are being invested globally, with around 100 new data centres currently planned or under construction in the UK alone. Their energy consumption is vast, and environmental concerns are growing.

Nvidia chief executive Jensen Huang has described these facilities as “AI factories”, arguing that rapid advances in artificial intelligence would not be possible without them.

The Case for Smaller, Localised Alternatives

Despite the investment surge, a quieter counter-movement is taking shape. Instead of fewer data centres, some experts envision many smaller ones.

Consultants and engineers argue that compact “edge” data centres located close to population centres could reduce latency, improve efficiency, and cut energy waste. Some experimental projects already exist — from small data centres heating public swimming pools to household-sized units warming private homes.

The idea is simple: if computing generates heat, why waste it?

Advocates say future cities could integrate small data centres into public buildings, housing estates, or unused commercial spaces, linking them into networks when large-scale processing is required.

Even Space Is Being Considered

Others are looking far beyond city limits. Companies are exploring the possibility of placing compact data centres in orbit, where cooling and energy efficiency could be improved. While still experimental, the idea reflects growing discomfort with the ever-expanding footprint of ground-based mega facilities.

Is the “Bigger Is Better” Model Cracking?

For years, the AI industry believed that scaling was everything — more data, more computing power, better results. But that assumption is now being questioned. As AI models become more specialised, they may require less brute-force computing.

Critics argue that not every AI system needs the vast capabilities of today’s large language models. A tool designed for medical diagnosis, for example, does not also need to generate poetry or pop lyrics.

If AI becomes smaller, smarter, and more targeted, the pressure to centralise everything inside massive data centres could ease.

A Gradual Shift, Not a Collapse

Few experts believe large data centres will disappear anytime soon. Demand for cloud services and AI continues to grow. But their role may evolve — from being the sole engine of AI to one component in a far more distributed system.

The future may not belong exclusively to either giant data centres or handheld devices, but to a hybrid world where processing happens wherever it makes the most sense.

And in that world, the dominance of the mega data centre may no longer be guaranteed.
