Home control system

The energy grid is a wide and complex system with many inputs and a huge number of outputs and consumers. Demand fluctuates constantly, yet the information coming back from consumers may be months out of date. It’s possible to measure the inputs to the system at the distribution side, but that gives little insight into where the energy is actually going. Although there are models to predict demand, it’s hard to validate them without genuine real-time data.

This is a common compromise in the sensing world – do you settle for a frequent, highly accurate measurement of just one or two variables, or do you accept many low-quality measurements instead? The answer depends partly on what you’re trying to measure, but in practice the two approaches can do much the same job.

For instance – an upstream oil and gas producer’s share price depends not just on how much oil it has ‘on tap’ but also on the value of what it is able to produce. This calls for a very direct, accurate measurement of how much oil is flowing – and of how valuable it is. That value depends on how much water and gas the oil contains – a ‘cappuccino’ of oil and foam may occupy plenty of volume but is worth less than pure oil. It’s not enough to take a single sample, since the flow can vary second by second.

At the other end of the spectrum is a system that uses a much more indirect measurement to assess how hire cars are being driven. By monitoring acceleration from a tag stuck to the windscreen, it is possible to detect how hard the car corners and whether there have been any collisions. This information is a different type of ‘accurate’ – it allows lots of cars to be monitored to a reasonable degree, which is exactly what an insurance company needs.
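As an illustration only, the tag’s event detection could be as simple as comparing lateral acceleration against fixed thresholds. The threshold values and labels below are assumptions for the sketch, not the real product’s logic:

```python
# Hypothetical thresholds (in g) for classifying driving events from a
# windscreen tag's lateral acceleration samples. Values are illustrative.
HARSH_CORNER_G = 0.4   # assumed level for harsh cornering
COLLISION_G = 4.0      # assumed level for a collision-scale event

def classify_events(lateral_g_samples):
    """Return (sample_index, label) pairs for samples over a threshold."""
    events = []
    for i, g in enumerate(lateral_g_samples):
        if abs(g) >= COLLISION_G:
            events.append((i, "collision"))
        elif abs(g) >= HARSH_CORNER_G:
            events.append((i, "harsh_corner"))
    return events

print(classify_events([0.1, 0.5, -0.2, 4.5]))
# [(1, 'harsh_corner'), (3, 'collision')]
```

Even something this crude illustrates the point: the per-sample accuracy is low, but summarising events across a whole fleet is exactly the aggregate view an insurer wants.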

The choice of ‘lots of inaccurate, indirect data’ versus ‘a small amount of accurate, direct data’ implies that there is a way to sift large amounts of data for information which is valuable. At a day-to-day level, the explosion of ‘fake news’ shows how relevant information can quickly be submerged under a tide of irrelevant data. Fortunately, there are well-established techniques for separating the relevant from the irrelevant – generally based on a model of how the system behaves.

An example of this is a home energy management system which uses many individual room sensors to predict how much an increase in temperature will cost. The system first learns the behaviour of the house and uses it to calibrate a model; that model can then predict what will happen if adjustments are made to the heat input.
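A minimal sketch of that ‘learn, then predict’ idea, assuming a simple first-order thermal model (heat capacity C, loss coefficient k); the model, its fitting method, and all numbers are illustrative assumptions, not the actual system:

```python
# Assumed room model: C * dT/dt = P_heat - k * (T_in - T_out)
# Step 1: learn k from logged temperatures and heater power.
# Step 2: predict the steady-state cost of running the room warmer.

def fit_loss_coefficient(indoor, outdoor, heat_power, heat_capacity, dt):
    """Least-squares estimate of k (W/K) from logged time series."""
    num = den = 0.0
    for i in range(len(indoor) - 1):
        dT_dt = (indoor[i + 1] - indoor[i]) / dt
        # heat flow not explained by the temperature change -> losses
        loss = heat_power[i] - heat_capacity * dT_dt
        delta = indoor[i] - outdoor[i]
        num += loss * delta
        den += delta * delta
    return num / den

def extra_power_for(delta_setpoint_kelvin, k):
    """At steady state, each extra degree costs k watts of heat input."""
    return k * delta_setpoint_kelvin
```

Once k is calibrated, `extra_power_for(1.0, k)` answers the homeowner’s question directly: holding the house one degree warmer costs roughly k extra watts, which converts straight into money at the tariff rate.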

Taking this to the extreme – what happens if you have thousands of machines to look after? It’s not practical to wire them all up to the Internet, especially if they are down a farm track in the developing world. We have proposed a system in which the sensors learn for themselves what is ‘normal’ and respond only when something has changed.

Looking across the industry from its widest point (consumers) to its narrowest (generators), it’s clear that ‘accurate’ can mean many things, from being able to issue correct bills to knowing where to focus maintenance effort. These different needs imply different target costs and quite different data requirements. The type of sensor can only be chosen by looking at the overall requirements, such as:

  • What actionable data is required? Is it more about estimating behaviour across many consumers to separate them into groups, or about getting lots of detail on a few?
  • What is the operating environment? All sensors are sensitive to their environment, but can this be safely ignored or calibrated out?
  • What sort of data network will carry all this information? There are many ways to do this, each with strengths and weaknesses (latency, cost, update rate, consumer acceptance).
  • What are users expecting from this? There is a huge difference in how a user will treat something if it’s seen as beneficial or not.

From a discussion like this it’s clear that many types of solution exist, varying both in cost and in technology readiness level (TRL). The development needed can range from taking an off-the-shelf solution and making it digital to developing a completely new sensing principle.

This type of thinking – considering the sensing as part of a wider business solution – is becoming the norm. Just as social media has become a major route from business to customers, modern sensing and data systems are becoming equally important. The control of the data and the ability to draw conclusions from it will depend on the correct design of the connection to the physical world – and the winners will be the ones who have considered that from the start.

Simon Jordan
Senior Sensor Physicist

Working in our sensing systems group, Simon specialises in navigation and communication. Before joining Cambridge Consultants, he spent ten years at Teledyne TSS, working on projects including electromagnetic pipe tracking/survey systems, ship steering systems, marine motion sensors, and the development of high grade inertial navigation systems.