Edge computing has existed since the 90s, and the premise is simple – bring processing closer to where data is generated. The technical benefits are clear: reduced latency, bandwidth optimisation, improved reliability, enhanced privacy and security, scalability and agility, real-time decision-making, and cost efficiency. The environmental benefits are also clear – by reducing the need for massive, centralised cloud infrastructure, the energy consumption and heat generation that come with it can be mitigated.
As an industry, it is set to grow substantially over the next five years, with spending on hardware, software and services expected to exceed $350bn in 2028. Many organisations take a hybrid approach, maintaining some on-premises technology while running other services in the cloud. Rapid adoption of AI is changing the landscape, however, and the growing demand is already having a material impact on energy and communications infrastructure.
There is a great deal of press coverage about this issue at the moment, with clear tension between the technology needs and expectations of consumers and organisations, and the impact on the environment and existing infrastructure. Energy consumption alone puts a huge strain on grids that were conceived and built with much lower capacities in mind. In the UK, the six Distribution Network Operators (DNOs) are slowly adapting, installing super grid transformers (SGTs), but these upgrades must also serve a range of other initiatives, including wind farms and rail electrification.
xAI’s Grok 3 is being trained on a cluster of 100,000 H100 chips, with training currently ongoing and expected to last until December. The 700 watts required by each chip, plus estimated overheads for cooling and supporting infrastructure, suggest the super cluster will draw around 150 megawatts – roughly the demand of a small town. Water usage for cooling has long attracted comment, and Google has recently had to rethink its plans for a $200m data centre in Santiago. In a country that has been experiencing a drought for the last decade, it’s not surprising that there is concern about the city’s aquifer being used to support a facility reliant on water for its cooling requirements. Grok 3, for context, is estimated to need 5 million litres of water a day to keep temperatures manageable.
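For context, the 150 MW figure is simple back-of-envelope arithmetic. A minimal Python sketch of that calculation is shown below; the chip count and per-chip wattage come from the figures above, while the overhead multiplier covering cooling and supporting infrastructure is an illustrative assumption, not a published number.

```python
# Back-of-envelope estimate of the Grok 3 cluster's power draw.
NUM_GPUS = 100_000        # H100 chips cited above
WATTS_PER_GPU = 700       # per-chip power figure cited above

gpu_power_mw = NUM_GPUS * WATTS_PER_GPU / 1_000_000   # watts -> megawatts
print(f"GPU draw alone: {gpu_power_mw:.0f} MW")       # 70 MW

# Cooling, networking and power-conversion losses are approximated here with
# a single multiplier (an assumed value chosen purely for illustration).
OVERHEAD_MULTIPLIER = 2.1
total_mw = gpu_power_mw * OVERHEAD_MULTIPLIER
print(f"With overheads: {total_mw:.0f} MW")           # roughly the ~150 MW cited above
```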

Edge technology addresses some of these issues, however, and the sector is growing as use cases and demand both increase – the micro data centres segment, as an example, is expected to realise a 13.4% CAGR in the period to 2030.
Edge AI is already being applied in a range of different environments:
Consumer – content delivery, gaming, and AR/VR
Enterprise – intelligent warehouses, micro data centres, remote/branch offices, smart retail stores
Industrial – private 5G, factory inspection, medical devices, manufacturing automation
Embedded – robotics, drones, intelligent traffic systems, autonomous checkout.
Demand for real-time data processing and analytics is only going to increase as the benefits permeate more of our technology infrastructure, but the notion that these needs can be addressed by huge, strategically placed, resource-intensive sites is flawed. Responsible use of AI is critical, and a mature strategy will incorporate inference at the edge, successfully addressing data governance, security, and energy usage.
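To make "inference at the edge" concrete, the sketch below shows what running a model locally on an edge device can look like using ONNX Runtime. It is a minimal example under stated assumptions: the model file, input shape and choice of runtime are illustrative placeholders rather than a recommendation of any particular stack.

```python
# Minimal sketch of on-device inference with ONNX Runtime (pip install onnxruntime).
# "model.onnx" and the 224x224 input are illustrative placeholders; a real edge
# deployment would typically use a quantised model sized for the local hardware.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

# A single frame of local sensor/camera data, processed on the device itself
# rather than being shipped to a centralised cloud service.
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: frame})
print("Local inference output shape:", outputs[0].shape)
```

Keeping inference local in this way means raw data never has to leave the site, which is where the governance, security and energy benefits described above come from.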
If your organisation is interested in exploring the benefits of edge AI or improving the way AI is utilised, including finding the right talent, please get in touch. At Tenon we’re happy to help you understand your options and ensure that the decisions you make around AI are properly informed and add value to the business.
