Customers have already gone through their own digitization and await the same progress from their utility. New assets bristle with sensors and legacy networks are being retrofitted to enable automation and the connection of distributed energy resources.
Most geospatial data is vectorised. Back-office functions have access to ERP systems full of data on financials, people and assets. Doesn’t that mean utilities are data-driven already? The answer is a resounding “no”.
Having lots of data and actually extracting significant value from it are two entirely different things. There’s huge value to be derived from well-thought-out data integration and analytics at any time. But in the world of disruption that utilities face today, that’s even more the case.
As an example, let’s consider networks businesses. I’m an ex-distribution engineer, so it is something of a comfort zone for me. Distribution networks all over the world are at some point along a complex continuum, perhaps beginning with distribution automation and advanced metering infrastructure, but moving all the way along to self-healing; volt/VAR optimisation; and significant implementations of demand-side management and distributed energy resources.
Each of these requires data and, to an increasing extent, analytics to deliver on its potential. They also increasingly require the smooth and efficient functioning of all the components making up the network. Fall short on any of this and the utility is, at best, wasting money. At worst, it is courting otherwise avoidable failures – with all their associated dangers, metaphorical and literal.
It might be too much to expect a single suite of applications to be a front end for all of these requirements (though I can think of a few vendors who will tell you otherwise…)1.
But we’re talking about the data-driven utility, remember? So what about the data? To a great extent, many of the requirements I’ve talked about above use the same data.
Data such as current, historical and forecast load; network topology; demand; voltage; location; asset specifications and history, etc. Since that’s the case, wouldn’t it make sense to have a single true version of that data for all of them to reference?
A single version of network and asset data will eliminate errors due to duplication, data conflicts and missing entities. It will provide definitive answers to questions previously argued over by different departments. It will ensure that everyone – and every system – shares one version of the truth.
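To make that concrete, here’s a minimal sketch of the kind of reconciliation a single version of the truth implies: merging asset records from two source systems and flagging any field where they disagree. The record sets, field names and asset IDs below are all invented for illustration – real utility data models are far larger and messier.

```python
# Hypothetical sketch: reconciling asset records from two systems
# (say, a GIS export and an ERP export) into one merged record set,
# while surfacing conflicts for a data steward to resolve.

def reconcile(gis_records, erp_records):
    """Merge two record sets keyed by asset ID; report conflicting fields."""
    merged, conflicts = {}, []
    for source in (gis_records, erp_records):
        for asset_id, fields in source.items():
            entry = merged.setdefault(asset_id, {})
            for key, value in fields.items():
                if key in entry and entry[key] != value:
                    # Same asset, same field, different values: a conflict
                    conflicts.append((asset_id, key, entry[key], value))
                else:
                    entry[key] = value
    return merged, conflicts

# Invented example records for one transformer
gis = {"TX-1001": {"location": "55.86N 4.25W", "type": "transformer"}}
erp = {"TX-1001": {"type": "transformer", "install_year": 1998}}

merged, conflicts = reconcile(gis, erp)
```

The interesting output here is usually the conflict list, not the merged records: every entry in it is exactly the kind of question different departments used to argue over.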
These are significant benefits in themselves. And when they are added to the financial gain from switching off a whole fleet of now-redundant operational data stores and other databases, the result can sometimes be enough of a business case in itself to trigger investment in an integrated data warehouse.
But there’s so much more to it than that. This is the starting point for that journey to becoming data-driven.
Integrated data allows the business to ask the kind of questions it has always wanted to ask, but didn’t – because waiting six months for IT to manually integrate all the data really wasn’t an option. And it allows data discovery: not only asking new questions, but simply taking an informed look around with analytics tools to uncover insights the business may never even have considered.
With integrated and easily accessible data, an asset business can develop meaningful Asset Health measures. And refine them as more data becomes available.
That same basic Health data can be the foundation for an asset criticality model, incorporating data sets on network topology; potential customer disruption; likelihood of injuries or collateral damage in the event of failure; lost revenue; cost of replacement, etc.
Again, such a model can be almost endlessly refined, as long as it provides new value2. What about potential reputational damage? Or a model that determines a loss of automation features necessitates immediate action for asset X, but that the same problem on asset Y can wait until a more cost-effective time for repairs?
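One common way to structure such a model – a sketch, not a prescription – is a risk-style score: likelihood of failure (from the Asset Health data) multiplied by a weighted sum of consequence factors like customer disruption, safety and repair cost. The weights, factor names and numbers below are all invented for illustration.

```python
# Hypothetical asset criticality sketch: risk = likelihood x consequence.
# All weights and factor values are invented; a real model would be
# calibrated against the utility's own data and refined over time.

def criticality_score(failure_likelihood, consequences, weights):
    """failure_likelihood and each consequence factor are scaled 0..1;
    higher score means act sooner."""
    consequence = sum(weights[k] * consequences[k] for k in weights)
    return failure_likelihood * consequence

weights = {"customers_lost": 0.4, "safety": 0.4, "repair_cost": 0.2}

# Same fault likelihood on two assets, very different consequences
asset_x = criticality_score(
    0.30, {"customers_lost": 0.9, "safety": 0.7, "repair_cost": 0.5}, weights)
asset_y = criticality_score(
    0.30, {"customers_lost": 0.2, "safety": 0.1, "repair_cost": 0.5}, weights)
# asset_x scores higher than asset_y: immediate action vs. a planned repair
```

This is exactly the asset X / asset Y distinction above: the same problem, ranked differently because the consequences differ.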
Although it can be a scary step for those managing Critical National Infrastructure, such integrated data and the associated asset analytics are also the foundation for modern predictive failure models. And from there, it’s a short distance to predictive maintenance. It’s been done in other highly safety-conscious arenas for years. The US Air Force springs to mind, for example.
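At its very simplest, that predictive idea can be illustrated by fitting a trend to a condition reading and estimating when it will cross a failure threshold. Real predictive failure models are far richer than a straight line, and every reading and threshold below is invented.

```python
# Toy predictive-failure sketch: least-squares trend through a condition
# reading (imagine dissolved gas in transformer oil, measured monthly),
# extrapolated to a failure threshold. Purely illustrative data.

def predict_crossing(readings, threshold):
    """Fit value = slope*t + intercept over t = 0..n-1; return the t at
    which the line reaches threshold, or None if not trending upward."""
    n = len(readings)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_v = sum(readings) / n
    slope = (sum((t - mean_t) * (v - mean_v) for t, v in zip(ts, readings))
             / sum((t - mean_t) ** 2 for t in ts))
    if slope <= 0:
        return None  # condition stable or improving: no predicted failure
    intercept = mean_v - slope * mean_t
    return (threshold - intercept) / slope

# Six monthly readings trending upward toward a threshold of 100:
months_to_limit = predict_crossing([40, 44, 47, 52, 55, 61], 100)
```

Even this toy version captures the shift in mindset: maintenance scheduled from a forecast of the asset’s condition, rather than from a fixed calendar.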
Think about it. If the USAF can rely on asset data to keep its engines turning and its people in the air, what more could you do with yours?
In my next blogs, I’ll explore further what becoming data-driven really means for other key utility functions such as customer services & energy retail; regulation and stakeholder management; finance and HR. Until then, I hope this has given you some food for thought.
1 They’re lying.
2 A point worth remembering. Models should only be refined and new data should only be integrated if there’s improvement to be delivered and new value to be had. If there isn’t, just stop.
About the Author
David Socha is Teradata’s practice leader for Utilities and Smart Cities, advising utilities on how they can transform their businesses through data and analytics. He began his career as a hands-on electrical distribution engineer, keeping the lights on in Central Scotland, before being a part of ScottishPower’s electricity retail deregulation programme in the late 1990s. After a period in IT Management and Consulting roles, David joined Teradata to found their International Utilities practice.