IoT Data: Strategic Sharing Will Yield Big Innovations
Don DeLoach is a data guy at heart. It’s fitting, then, that the former CEO of Infobright, a database software firm recently acquired by Ignite Technologies, has co-written a book in which data plays a starring role. Titled “The Future of IoT: Leveraging the Shift to a Data Centric World,” the book fleshes out a vision for the future of IoT-enabled enterprises.
The crux of the book is that enterprise companies can get the most out of Internet of Things (IoT) data by contextualizing it and strategically sharing it with other organizations. Here’s an example: Let’s say an individual Subway franchise leveraged data on pedestrian traffic, street traffic, weather, equipment availability, and prior sales to fine-tune its business operations. The franchise could, for instance, anticipate when a surge of customers is likely to come through its doors on a Friday evening, and adjust its inventory and crew scheduling to prepare for it. And the larger Subway organization could aggregate such information across thousands of its franchises to inform its own buying, scheduling, and capital equipment investments.
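The franchise scenario can be sketched in code. The following is a minimal, illustrative model, not anything from the book: the data class, coefficients, and signal choices are all hypothetical, standing in for whatever tuned model a real franchise would use to blend historical sales with live context.

```python
from dataclasses import dataclass

@dataclass
class HourlyContext:
    pedestrian_count: int       # foot-traffic sensor near the store
    rain_expected: bool         # weather feed
    avg_sales_same_hour: float  # prior sales for this weekday and hour

def forecast_demand(ctx: HourlyContext) -> float:
    """Blend historical sales with live context signals.

    Coefficients are illustrative, not tuned on real data.
    """
    demand = ctx.avg_sales_same_hour
    # More foot traffic means more walk-in customers.
    demand *= 1.0 + 0.10 * (ctx.pedestrian_count / 1000)
    if ctx.rain_expected:
        demand *= 0.85  # rain suppresses walk-in conversions
    return round(demand, 1)
```

A franchise could run a forecast like this per hour to drive inventory and crew scheduling; the parent organization could aggregate the same outputs across locations.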
Before working on the book, DeLoach began to chart out hypothetical IoT use cases like the one above across a range of industries. The more he reflected on IoT case studies, the more he saw the power of data in context.
The value of IoT data increases dramatically when a variety of organizations agree to share relevant information with one another. It’s a situation where the whole is greater than the sum of its parts, DeLoach says. A company selling a jet engine as a service to an airline, for instance, can help the airline optimize fuel consumption by leveraging data coming off of sensors in its engines. But the jet engine maker could further improve fuel consumption by weaving in relevant data coming from the avionics system in planes using its products. And by extension, the company maintaining the avionics could optimize the operation of that technology with jet-engine data.
This principle of data sharing can extend further, DeLoach says. “I want product providers to get their stream of data and also my supply chain data to get that data and the regulatory oversight groups to get access to relevant data,” he explains. “I want to spend less money providing OSHA with what they need, and I want them to do a better job. Everybody wins if you get the right data to the right constituent at the right time.”
In the book, which DeLoach coauthored with Gartner analyst Emil Berthelsen and Wael Elrifai, an executive at Pentaho (a Hitachi division), he argues that the most advanced IoT implementations will not be single optimized products but systems of systems. In the first IoT phase, connected sensors help monitor how a product operates; the organization can then use that data to boost efficiency and productivity.
DeLoach illustrates this progression with the example of an IoT-enabled Coke machine. The first IoT phase was to outfit the Coke machine with sensors to monitor when it needs refilling, the internal temperature, and so forth. In the next stage, an IoT-enabled Coke machine could do everything it could before, but with considerably more sophistication, while also offering new features. For instance, Coke could take the sensor data from its network of vending machines and enrich it with data on weather, traffic, demographics, and more. Instead of the Coke machine sending a message essentially saying: “I’m at 10% capacity. Please fill me up with soft drinks,” the machine could say: “I project that I’ll be at 10% capacity at 8 a.m. on Friday. That would be in the middle of rush hour, however, and I see that a storm is forecast to arrive then as well, so it would likely be more efficient to send a delivery driver at 1 p.m. on Thursday.”
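The vending-machine logic above amounts to two steps: project when stock runs low, then shift the refill visit to avoid bad windows like rush hour or a storm. Here is a minimal sketch under assumed inputs; the function names, the one-hour buffers, and the idea of representing rush hour and storms as simple time windows are all hypothetical simplifications.

```python
from datetime import datetime, timedelta

def project_empty_time(level_pct: float, drain_pct_per_hour: float,
                       now: datetime) -> datetime:
    """Project when stock falls to the 10% refill threshold
    at the current drain rate."""
    hours_left = max(level_pct - 10.0, 0.0) / drain_pct_per_hour
    return now + timedelta(hours=hours_left)

def schedule_refill(projected_empty: datetime,
                    bad_windows: list) -> datetime:
    """Pick a refill time before projected depletion that avoids
    rush-hour or storm windows (illustrative heuristic)."""
    candidate = projected_empty - timedelta(hours=1)  # buffer before empty
    for start, end in sorted(bad_windows, reverse=True):
        if start <= candidate <= end:
            candidate = start - timedelta(hours=1)  # move ahead of the window
    return candidate
```

With a machine at 34% on Thursday 8 a.m. draining 1% per hour, depletion projects to Friday 8 a.m.; a Friday-morning rush-hour/storm window then pushes the visit earlier.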
“You only get benefits like that by looking at the data contextually at a broader level,” DeLoach says. While making sense of massive troves of IoT data can be a sizable challenge, the most successful organizations will be those that capture as much data as possible and let utility dictate what to do with it, DeLoach explains. Edge computing can improve the signal-to-noise ratio, while an event-driven, publish-and-subscribe data architecture can help ensure that the right data gets to the right place at the right time. “You start with capturing and securing it in a master repository,” DeLoach says. “But then you think about the data’s context in the larger environment, and how you can enrich and propagate it.”
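The publish-and-subscribe idea mentioned above can be shown in a few lines. This is a generic, in-memory sketch of the pattern, not any particular product: producers publish events to named topics, and each constituent subscribes only to the topics it needs, so the supplier sees inventory events while a regulator could subscribe to safety events instead.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal topic-based publish/subscribe bus (illustrative sketch)."""

    def __init__(self) -> None:
        self._subscribers: defaultdict = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a handler to receive events on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        """Deliver an event to every handler subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(event)

# Hypothetical usage: a supplier only sees low-inventory events.
bus = EventBus()
supplier_inbox: list = []
bus.subscribe("inventory.low", supplier_inbox.append)
bus.publish("inventory.low", {"machine": "A7", "level_pct": 10})
bus.publish("telemetry.temp", {"machine": "A7", "temp_c": 4.2})
```

Production systems would use a broker such as an MQTT or Kafka deployment rather than an in-process bus, but the routing principle is the same.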