The energy grid is transforming – it’s time for a new system of distribution management and control

by Gavin Doyle | Nov 3, 2025

The UK energy grid is, like many across the world, undergoing its most profound transformation in a century. For distribution network operators (DNOs), the old ways of working are becoming obsolete, due to more localised variation in energy generation and consumption. To navigate this disruption, I’m a strong advocate for a network voltage performance index (NVPI) to measure network performance and determine appropriate action.

As electrification of heating (air source heat pumps) and transport (electric vehicles) accelerates, the demands on the lower levels of the grid are growing. But the network at the domestic property level is vast, with hundreds of miles of cables and tens of thousands of substations. Monitoring the quality of the network to pinpoint areas for maintenance and upgrades is an ever-increasing need.

The very definition of a ‘good’ network now extends beyond simple reliability to encompass its role as a fair and efficient enabler of clean, resilient and secure energy supplies. Against this backdrop, an NVPI – with a single holistic score to quantify the health and readiness of distribution grids – would bring great benefits. Such an initiative, which I heard described by future network consultant Stewart Reid at a recent Utility Week event, could become an essential tool for steering billions of pounds of investment.

But first we must address critical questions. What variables should the index comprise, and how do we negotiate the complex trade-offs between them to achieve the required measurement and actionable insight?

The traditional approach would involve a central committee of experts slowly analysing these questions with significant stakeholder engagement. This process would struggle to keep pace with the size of the network and the necessary speed of change. A radical alternative is needed, one built on a universal language for intelligent systems.

From centralised human participation to AI decentralisation

A comprehensive NVPI would need to be built on a foundation of data. The index would require inputs from dozens of sources: real-time smart meters, grid-edge power quality analysers, financial records, customer databases, and even socio-economic datasets like the index of multiple deprivation (IMD). This presents two profound challenges that render basic algorithms insufficient.

First, how do you make the appropriate judgments to calculate the metrics? The relative importance of different indicators may not be uniform. The weighting for ‘network resilience and LCT readiness’ should arguably be higher in a suburban area with rapid EV charger adoption than in a rural area with stable demand. A simple, centralised algorithm struggles to apply this kind of dynamic, location-specific judgment.
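
To make that judgment concrete, here is a minimal Python sketch of how an NVPI score might be assembled from weighted indicators, with the weighting profile varying by area type. The indicator names, area types and weight values are all illustrative assumptions, not a proposed specification.

```python
# Illustrative sketch: the NVPI as a weighted sum of normalised indicators,
# with a weighting profile chosen by local context. All names and values
# here are hypothetical examples, not a proposal for the real index.

# Hypothetical weighting profiles per area type (each sums to 1.0).
WEIGHT_PROFILES = {
    "suburban_rapid_ev": {
        "time_outside_statutory_limits": 0.25,
        "network_resilience_lct_readiness": 0.45,  # weighted up for rapid EV adoption
        "power_quality": 0.20,
        "social_equity": 0.10,
    },
    "rural_stable_demand": {
        "time_outside_statutory_limits": 0.35,
        "network_resilience_lct_readiness": 0.20,  # stable demand, lower weighting
        "power_quality": 0.25,
        "social_equity": 0.20,
    },
}

def nvpi_score(indicators: dict[str, float], area_type: str) -> float:
    """Combine normalised indicator scores (0 to 1, higher is better)
    into a single index value using the area's weighting profile."""
    weights = WEIGHT_PROFILES[area_type]
    return sum(weights[name] * indicators[name] for name in weights)

# The same raw indicator scores produce different index values depending
# on the local weighting profile, i.e. the dynamic judgment described above.
readings = {
    "time_outside_statutory_limits": 0.9,
    "network_resilience_lct_readiness": 0.4,
    "power_quality": 0.8,
    "social_equity": 0.7,
}
print(nvpi_score(readings, "suburban_rapid_ev"))
print(nvpi_score(readings, "rural_stable_demand"))
```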

Second, how do you access this huge volume of data without creating a crippling overhead? The cost and compliance risks (e.g., GDPR) of centrally storing every voltage reading from millions of smart meters are immense. The challenge is to extract the necessary intelligence to make the index meaningful without drowning in the data itself.

This is where innovation must focus. An agentic AI approach is uniquely suited to solve this data dilemma. Agents can be designed to make decentralised judgments, applying different weightings and scoring based on their specific local context.

And to solve the data overhead, agents could employ techniques like federated learning, processing data locally at the source (e.g., within a substation’s IT systems) and only sharing the resulting insights. This minimises data transfer and storage, reducing both cost and compliance burdens.
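
As a simplified illustration of that pattern, the sketch below reduces raw voltage readings to a small summary at the source, so only a handful of numbers ever leave the substation. It is a stand-in for a real federated learning pipeline; the statutory band shown is the UK's 230 V +10%/-6% limit.

```python
import statistics

# Illustrative sketch of 'process locally, share only insights'. A real
# deployment would use a federated learning framework; here we simply
# reduce raw smart-meter voltage readings to a small summary at source.

def local_summary(voltage_readings: list[float],
                  lower_limit: float = 216.2,   # UK statutory band: 230 V -6%
                  upper_limit: float = 253.0) -> dict[str, float]:  # and +10%
    """Runs inside the substation's IT systems: raw readings never leave."""
    outside = [v for v in voltage_readings
               if not lower_limit <= v <= upper_limit]
    return {
        "mean_voltage": statistics.mean(voltage_readings),
        "stdev_voltage": statistics.stdev(voltage_readings),
        "fraction_outside_limits": len(outside) / len(voltage_readings),
    }

# Only this handful of numbers is transmitted upstream, not the millions
# of raw readings, cutting both storage cost and GDPR exposure.
print(local_summary([230.1, 229.8, 254.2, 231.0, 215.9, 230.5]))
```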

How MCP and AI agents offer a potential solution

Imagine a digital ecosystem where every stakeholder is represented by an autonomous AI agent. This is made possible by the model context protocol (MCP), a new standard that acts as a universal translator, allowing different AI agents to communicate and negotiate.

This creates a ‘model of agents’. Instead of a centralised model, the MCP provides a shared environment where each agent plugs in, understands the state of play, declares its goals, and proposes actions in a standardised way. Each agent is still driven by its own unique prime directive, but the MCP governs how they interact. The process is built on three core components, illustrated in the code sketch after this list:

  • Context. The central system publishes the current state of the NVPI as a standardised ‘context’ object. All agents can instantly read this to understand the current weightings and scores for every indicator, from time outside statutory limits to social equity score
  • Intent. Each agent explicitly declares its goal. The LCT developer agent broadcasts its intent to minimise constrained access costs, while the Ofgem agent declares its intent to maximise consumer value and protection. This creates transparency
  • Action. Agents propose changes by issuing standardised ‘actions’, such as a proposal to increase the weighting of Power Quality. The MCP’s rules ensure every action is valid and auditable
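
To make the three components tangible, here is a minimal Python sketch of how context, intent and action messages might be shaped. The class and field names are illustrative assumptions and are not taken from the MCP specification.

```python
from dataclasses import dataclass

# Hypothetical message shapes for the context/intent/action cycle described
# above. These are illustrative assumptions, not the MCP wire format.

@dataclass
class Context:
    """Published state of the index that every agent can read."""
    weightings: dict   # e.g. {"power_quality": 0.20, ...}
    scores: dict       # current score per indicator

@dataclass
class Intent:
    """An agent's explicitly declared goal, visible to all participants."""
    agent: str         # e.g. "lct_developer" or "ofgem"
    goal: str          # e.g. "minimise constrained access costs"

@dataclass
class Action:
    """A standardised, auditable proposal to change the index."""
    agent: str
    indicator: str
    proposed_weight: float
    rationale: str = ""

def is_valid(action: Action, ctx: Context) -> bool:
    """One rule the protocol layer might enforce: proposals must target a
    known indicator and keep its weight within [0, 1]."""
    return (action.indicator in ctx.weightings
            and 0.0 <= action.proposed_weight <= 1.0)

# Example step: a hypothetical agent proposes raising the power quality
# weighting; the rule check keeps the action valid and auditable.
ctx = Context(weightings={"power_quality": 0.20, "social_equity": 0.10},
              scores={"power_quality": 0.8, "social_equity": 0.7})
proposal = Action(agent="lct_developer", indicator="power_quality",
                  proposed_weight=0.25, rationale="rapid EV uptake locally")
print(is_valid(proposal, ctx))  # True
```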

Addressing the data challenge

The ambition of the NVPI is built on a mountain of diverse data from dozens of sources. Teams here at CC are already trying to tackle this through the creation of Cloud Energy, an interoperability platform for establishing and connecting energy agents and enabling them to collaborate within a hierarchy. But beyond such initiatives, profound challenges remain. How do you access data and make appropriate judgments, especially when the ‘correct’ weighting for an indicator might vary by location? A simple, centralised algorithm cannot solve this.

The MCP framework is the key. It allows agents to request specific, contextual data to inform their judgments. An agent representing a specific region could pull local LCT uptake data and dynamically adjust its ‘intent’ to prioritise network resilience and LCT readiness. Also, by standardising data requests, the MCP enables decentralised data handling like federated learning, where agents can learn from sensitive data locally without requiring the utility to create a crippling and high-risk central data store.
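
As a hedged sketch of that behaviour, the snippet below shows a regional agent switching its declared intent when local LCT uptake crosses a threshold. The data fields, threshold and intent labels are all assumptions for illustration.

```python
# Illustrative sketch: a regional agent adjusts its declared intent using
# local LCT uptake data pulled via a (here, simulated) standardised request.
# The threshold and intent labels are assumptions for illustration.

def regional_intent(ev_chargers_per_1000_households: float,
                    heat_pumps_per_1000_households: float) -> str:
    """Return the priority this region's agent would declare."""
    lct_uptake = ev_chargers_per_1000_households + heat_pumps_per_1000_households
    if lct_uptake > 150:   # hypothetical threshold for rapid adoption
        return "prioritise network resilience and LCT readiness"
    return "prioritise time outside statutory limits"

print(regional_intent(ev_chargers_per_1000_households=120,
                      heat_pumps_per_1000_households=60))
```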

Powerful strengths and opportunities

This approach would create a powerful system informed by:

  • Interoperability and scalability. Any stakeholder can develop an MCP-compliant agent and plug in to the negotiation. This makes the system open and massively scalable, allowing new market participants to easily join
  • Transparency. The explicit declaration of ‘intents’ makes the motivation behind every agent’s action clear and auditable for the regulator and the public
  • Dealing with complexity. The system can analyse an astronomical number of potential trade-offs, revealing hidden synergies and conflicts that a human committee would never find
  • Regulation. The process can be re-run instantly to adapt the index to new policies or technologies, making regulation agile and responsive

Confronting the challenges

Despite its power, this approach faces significant challenges:

  • The prime directive problem. MCP provides a perfect language for communication, but it cannot solve the challenge of perfectly translating complex human values into an agent’s coded prime directive. A flawed objective for a ‘vulnerable customer’ agent could lead to perverse outcomes
  • The black box problem. The emergent negotiation process between complex AI agents can still be difficult to interpret, posing a challenge for regulatory oversight and assurance
  • Accountability. If an AI-negotiated index leads to poor societal outcomes, who is accountable? The final decision on balancing competing interests needs to reflect human judgment

To conclude, I firmly believe that the most pragmatic and powerful path forward is a human-AI collaborative model. In this model, the MCP-powered agentic system does the heavy lifting. It explores the landscape of possibilities and presents a small set of optimised, index-linked options to the human decision-makers.

Essentially, the model acts as the ultimate consultant, clearly laying out the data-driven consequences of every potential compromise. This frees the human stakeholders to focus on other challenges and opportunities. If you are a member of the DNO community and you’d like to discuss any aspect of this topic, drop me a message. It would be great to continue the conversation.


