IoT in Utilities: A Look at Future Applications
Demands for reliability and security are contributing to an innovation lag that is holding the utilities industry back from adopting IoT.
August 20, 2019
Utilities customers expect the television to turn on, clean water to gush from the tap and the furnace to kick in if the temperatures dip.
Those unceasing demands for reliable service, however, could be holding the utilities industry back from innovating for the future, analysts said.
“Keeping the power flowing isn’t a trivial responsibility,” said Neil Strother, principal research analyst with Navigant. “Utilities are not geared to the speed of a startup or Silicon Valley. They’ve made a commitment to keep millions of lights on.”
Matt Schnugg, senior director of data and analytics, ML/AI at GE Power Digital, agreed, adding that conservative investment policies and unwieldy governance structures can also hinder innovation.
As a result, utilities “may not have evolved as quickly as the technology has innovated or the business model has evolved,” he said.
Along with financial and governmental institutions, utilities are among the slowest industries to move to the cloud, analysts said. Mounting cyberattacks demand vigilant security protocols, and there is a strong tendency to keep data and compute nearby.
Wide-scale cloud adoption by utilities is unlikely, at least in the foreseeable future, Schnugg said. “There are always latency issues with processing and the need for real-time analysis, so those servers are generally located on prem for use in the control room.”
In the meantime, the balance of compute will likely shift toward the cloud, given the proliferation of connected devices and the potential for fully automated intelligence sharing.
“I can send off thousands of cores running an algorithm with 15 years of data on the Eastern seaboard, scale up, and back,” Schnugg said. “You can’t achieve that on premises from a cost-efficiency perspective.”
Cloud-based platform solutions that ingest data from multiple nodes and third parties, however, can provide vital intelligence, said Lance Brown, vice president of customer service solutions for Smart Energy Water, a SaaS and analytics provider in the utilities industry.
For example, data from hundreds of utility users in an area can be analyzed to gauge average consumption, identify usage anomalies caused by leaks and suggest opportunities for conservation programs or demand-response buying to reduce strain on the grid, said Brown, formerly the director of customer service with the Los Angeles Department of Water and Power. “It’s all scalable and we can eliminate real waste.”
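As a rough illustration of the kind of analysis Brown describes, a simple outlier check over meter averages can surface possible leaks. All names and numbers here are hypothetical, and this is a simplistic stand-in for a production analytics pipeline, not Smart Energy Water's actual method:

```python
import statistics

# Hypothetical hourly water-consumption readings (gallons) from meters in one area.
readings = {
    "meter_001": [12.1, 11.8, 12.4, 11.9, 12.2],
    "meter_002": [10.5, 10.7, 10.4, 10.6, 10.5],
    "meter_003": [30.2, 31.0, 29.8, 30.5, 30.9],  # well above neighboring meters
}

def flag_possible_leaks(readings, ratio=2.0):
    """Flag meters whose average usage far exceeds the area's typical meter.

    The median is used as the baseline so a single leaking meter
    can't drag the comparison point up with it.
    """
    averages = {meter: statistics.mean(values) for meter, values in readings.items()}
    baseline = statistics.median(averages.values())
    return sorted(meter for meter, avg in averages.items() if avg > ratio * baseline)

print(flag_possible_leaks(readings))  # ['meter_003']
```

A real deployment would compare each meter against its own history and weather-adjusted expectations rather than a flat ratio, but the principle — aggregate, baseline, flag deviations — is the same.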
The American Society of Civil Engineers’ most recent infrastructure report card indicates that 6 billion gallons of treated water are wasted every day because of leaking pipes, a problem that sensors and analytics can help reduce significantly.
Also, billions of dollars of investment are still needed in aging and “unintelligent” utility infrastructure over at least the next decade, in part to collect data that will fuel AI and machine learning, according to the engineers.
Tremendous value can be unlocked from data when planning infrastructure investments or required skill sets, planning that is greatly informed by what happened in the past, Schnugg said.
“For a long time, our utilities have done a great job being able to manage this given the depth of experience of individuals in the workforce,” he said. “But as it ages out, we’re hoping to help codify that experience into software.”
Strother also sees workforce challenges with IoT in utilities and the trend toward digitalization. “It’s tough,” he said. “People studying AI and machine learning, they aren’t rushing to apply their skills at the utilities. You’re competing with Google and Facebook and Amazon and doing robotics.”
More Data, More Errors
Paradoxically, as the utilities generate more and more information, less of it can be trusted, Schnugg said.
“What if a model could be created to run through a series of data checks automatically, and not only surface the inaccuracies, but plug those back in?” he said, adding that GE’s new analytics tool can help reduce uncertainty in network data.
The goal is to reach a point where people don’t have to check through data themselves, or file a ticket for analytics, or literally drive out to see if a sensor is on a particular power line, he said.
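The automated data checks Schnugg describes could be sketched as a pipeline of validation rules that surfaces inaccuracies and plugs an estimate back in. This is purely illustrative — it is not GE's analytics tool, and all values and thresholds are invented:

```python
# Illustrative sketch only -- not GE's tool. A series of automated checks
# runs over sensor readings, surfaces inaccuracies, and "plugs back in"
# an imputed replacement so no one has to inspect the data by hand.

def check_range(value):
    """Treat a line-voltage reading as bad if missing or outside plausible bounds."""
    return value is not None and 0.0 < value < 1000.0

def clean_readings(readings):
    """Replace readings that fail the checks with the last known good value."""
    cleaned, flagged, last_good = [], [], None
    for i, value in enumerate(readings):
        if check_range(value):
            cleaned.append(value)
            last_good = value
        else:
            flagged.append(i)          # surface the inaccuracy
            cleaned.append(last_good)  # plug an estimate back in
    return cleaned, flagged

readings = [120.1, 119.8, None, 5000.0, 120.3]
cleaned, flagged = clean_readings(readings)
# readings at indexes 2 and 3 fail the checks and are replaced with 119.8
```

In practice the imputation step would lean on models of neighboring sensors and historical patterns rather than carrying the last value forward, but the flag-and-repair loop is the core idea.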
Hopefully, he added, every utility will eventually be able to recreate any event on its grid from different perspectives, because the data has been appropriately stored and modeled, can be efficiently recalled and can be immediately fed back into operations. He also sees a day when data will be used en masse to predict future outcomes.
Currently complicating the ability to synthesize data, Strother said, is the growth in renewables and resulting two-way flow needed when customers with solar PV and wind power return energy to the grid.
“The business model and economics are changing,” he said. “It’s getting more complex. The challenge is how to figure out all the above and be good stewards of the environment.”
To deal with that complexity, Schnugg said he sees increased engagement among utilities, national power researchers and think tanks. He mentioned GE’s R&D partnership with Pacific Northwest National Laboratory.
The lab, under the Department of Energy, studies, among other things, smart grids that enable two-way transactions of power, machine learning and analysis tools, and the sharing of cyberthreat information among utilities.
“Data will continue to converge,” Schnugg said. “You have to be very careful. When you create the most interesting data set on the planet, it’s interesting for the bad guys as well.”
As cyberthreats between the United States, Russia and Iran increasingly appear in the news, PNNL and its partners in the Cybersecurity Risk Information Sharing Program are using “high-performance compute and machine-learning to spot trouble in the vast sea of data generated by IoT devices and sensors,” according to the lab. “By quickly identifying trends and relationships that may reveal a potential threat, grid operators and automated protection systems can rapidly take action to protect the grid.”
Are utilities open to sharing data that could lead to cost savings for themselves and consumers?
Schnugg said acceptance is growing, but it’s currently more popular in Western European countries that are more progressive in their thinking and have consortiums to monitor wide areas of the grid.
“They recognize that more data is generally better,” Schnugg said.
Not so long ago, utilities had few incentives to be on the cutting edge of innovation. They valued uptime, security and safety as they should have, the analysts said, and there largely wasn’t any competition.
“Now, innovation needs to happen fast,” Schnugg said. “Business models are changing. There’s more decentralized energy and there’s a need to adopt new technology to drive it.”
Whether IoT and analytics will produce useful, actionable information or address the need for new energy sources remains to be seen. “The answer is complicated but most certainly yes,” said Chris Moyer, senior director of content and research for Zpryme, an energy-focused research firm based in Austin, Texas.
“The power of AI and ML can speed up the power of data computation and testing for technology related to solar PV, storage, and potentially,” he said, “even fission.”