Is Your IoT Architecture Keeping You in Pilot Purgatory?
Editor’s note: Don DeLoach is the president and COO of CENTRI Technology and an advisory board member at Microshare.io. Don is also the coauthor of the book “The Future of IoT.” He serves on the Executive Board of the Illinois Technology Association, is co-chair of the ITA Internet of Things Council and is the organizer of the Midwest IoT Summit.
We at the Illinois Technology Association are in the final stages of planning our sixth annual Midwest IoT Summit. While hardly scientific research, the process of creating the agenda and reaching out to potential speakers for the event has hammered home the point that there are still mountains of IoT pilot projects, yet not that many production-scale IoT deployments. This conclusion is backed up by the World Economic Forum and McKinsey’s notion of “pilot purgatory,” a term describing long-term experimental IoT projects that fail to roll out at production scale as a result of “inability” or “lack of conviction.”
While there are vital considerations related to people and process that can prevent Internet of Things projects from reaching production scale, one often overlooked dimension is IoT architecture.
To understand why, let’s take a look at the evolution of the Internet of Things itself, drawing on Michael Porter and James Heppelmann’s seminal article “How Smart, Connected Products Are Transforming Competition.” In their framework, smart products are not new but simply refer to products with microprocessors, sensors, software and frequently an embedded operating system. Controlling a traditional thermostat required you to physically interact with it. But you could program a smart (but unconnected) thermostat for different times of the day. In theory, you could also connect such a thermostat to a motion sensor and trigger that thermostat to turn itself off if it hasn’t detected movement for, say, 30 minutes. Taking this smart thermostat and adding connectivity, however, adds a whole new dimension in the form of internet access and back-end processing. With a Nest or equivalent thermostat, you can check the temperature of your house and control the device remotely. You can program a smart fan to kick on when the temperature rises above a defined threshold. In addition, a handful of utilities offer consumers energy discounts for using these types of devices, as the technology can allow them to better manage the electrical grid load during peak demand times.
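The “smart but unconnected” thermostat behavior described above is simple enough to sketch in a few lines. This is purely an illustrative sketch, not any vendor’s actual firmware; the class and method names are hypothetical.

```python
import time

# Illustrative sketch of the smart-but-unconnected thermostat described
# above: heat runs normally, but switches off after 30 minutes with no
# detected motion. All names here are hypothetical.
IDLE_LIMIT_SECONDS = 30 * 60  # 30 minutes without movement

class SmartThermostat:
    def __init__(self):
        self.heating_on = True
        self.last_motion_time = time.monotonic()

    def on_motion_detected(self):
        # Wired to the motion sensor; resets the idle timer and
        # resumes heating if it had been idled off.
        self.last_motion_time = time.monotonic()
        self.heating_on = True

    def tick(self, now=None):
        # Called periodically by the device's control loop.
        now = time.monotonic() if now is None else now
        if now - self.last_motion_time >= IDLE_LIMIT_SECONDS:
            self.heating_on = False
```

Note that everything happens locally on the device; adding connectivity, as the paragraph above explains, is what opens up remote control and back-end processing.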
Many other smart, connected devices are fundamentally similar in terms of this progression of capabilities and possibilities. And as a result, product providers have established back-end processing capabilities to digest the data coming from such smart, connected products and perform subsequent analytics. In many cases, they included a smartphone or web-browser-based interface.
In many respects, this evolution from physical or manual-based devices, to smart devices, to smart and connected devices was monumental. But from a technological standpoint, it frequently creates walled gardens. In many cases, a smart, connected product creates data and then is in charge of consuming it. This model has been around for decades. In the 1970s, for instance, a demand deposit system used in banking generated data that could only be used by that system. Such data existed in hierarchical databases like IBM’s Information Management System. At the time, we called them “silos,” without thinking much of it. But, in 2018, if you characterize a technology implementation as a “silo,” it’s probably not meant to be complimentary. But for many IoT pilots, the term fits. You pilot a smart lighting system, a smart milling machine, a smart facial recognition system, and the data you can use is what the manufacturer decides you can have.
If you launch a pilot with a siloed IoT architecture, you might end up learning how IoT systems work, but the project isn’t likely to demonstrate significant business value. For that to happen, you would need to contextualize the data your IoT devices gather with your other enterprise data and even external third-party data.
Conversely, if you deploy IoT architecture in a manner that allows you to separate the creation of the data from the consumption of it, that’s where the equation begins to change. You can focus more on the utility of the data rather than where it lives or who owns it.
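One common way to realize this separation is a publish/subscribe pattern, where devices publish data to named topics and any number of consumers subscribe without the producer knowing or caring who they are. The following is a minimal in-memory sketch of the idea; the topic names and payloads are illustrative, and a real deployment would use a broker such as MQTT or a cloud equivalent.

```python
from collections import defaultdict

# Minimal in-memory publish/subscribe sketch: the device that creates the
# data is decoupled from whoever consumes it. Topic names and payloads
# are illustrative only.
class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Any consumer can attach to a topic after the fact.
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        # The producer never knows who is listening.
        for handler in self.subscribers[topic]:
            handler(message)

broker = Broker()
analytics_feed = []
broker.subscribe("store-17/camera/events", analytics_feed.append)  # analytics consumer
broker.subscribe("store-17/camera/events", lambda m: None)         # e.g. a CRM consumer
broker.publish("store-17/camera/events", {"event": "visit", "count": 1})
```

The point of the sketch is the shape, not the plumbing: once creation and consumption are decoupled, adding a new consumer of the data requires no change to the device producing it.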
There is a chance that a pilot with a siloed data architecture can move closer to production while still remaining trapped in “purgatory.” You could also call it a “production pilot.” For instance, let’s say you tested a facial recognition system and it proved successful enough to convince you to roll it out to 200 stores. But the data it generates is still siloed, even though the system is in production. The system could reach full-scale production once you have integrated the data from the facial recognition system with the CRM, point-of-sale and other enterprise systems. But if the data starts out siloed, that’s a hard transition to make. Over time, business realities will make it imperative for production pilots to move toward enterprise integration. While it’s happening slowly, the good news is it actually is happening.
One piece of evidence that this progression is happening is the industry’s increased focus on edge computing, which is analogous to the “first receiver” architecture described in the book I co-authored, “The Future of IoT: Leveraging the Trend to a Data Centric World.” Thanks to edge computing, the data from end-point sensors on various devices is increasingly being consolidated, contextualized, filtered and enriched at the edge near the point of creation, where it can be packaged and delivered to a variety of constituents as needed. Good examples of this progression include the work being done by the EdgeX Foundry and the OpenFog Consortium, products like Amazon Greengrass, Microsoft IoT Hub and Intel IoT Hub, and companies like Clearblade, MachineShop, Foghorn and others. This is the power and associated leverage that comes from the right architecture.
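The filter-and-enrich step described above can be sketched simply. This is a hypothetical edge-gateway function, not the API of any of the products named; the field names, site identifier and temperature range are illustrative assumptions.

```python
# Hypothetical edge-gateway step: drop implausible readings at the point
# of creation, then enrich the survivors with context (here, a site ID)
# before forwarding them to whichever consumers need them. All names and
# thresholds are illustrative.
def process_at_edge(readings, site_id="plant-3"):
    enriched = []
    for r in readings:
        temp = r.get("temp_c")
        if temp is None or not (-40.0 <= temp <= 85.0):
            continue  # filter out sensor glitches locally
        enriched.append({
            "site": site_id,     # context added at the edge
            "sensor": r["id"],
            "temp_c": temp,
        })
    return enriched

batch = [{"id": "s1", "temp_c": 21.5}, {"id": "s2", "temp_c": 999.0}]
clean = process_at_edge(batch)
```

Doing this near the point of creation means bad data never crosses the network, and every downstream consumer receives the same contextualized stream rather than raw, siloed device output.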
Six years ago, when we began the Midwest IoT Summit, there were a small number of pilots and almost no production instances of IoT systems. As we look toward our sixth annual summit, our theme this year is “Pilots to Production.” We will be exploring the technical and non-technical reasons that are both inhibiting and enabling the progression from pilots to production. IoT is inherently a holistic proposition, where the foundational elements need to be understood and integrated thoughtfully to truly be effective.
The greatest value of IoT is a function of creating and accessing the right data as input to the right analytics that can provide for the right outcomes. Getting to that (production) point is a function of the right architecture, which impacts decisions about sensors, communications, security, privacy, analytics and certainly data ownership and governance. The summit is two days. The first day will focus on the horizontal foundational elements (inclusive of architecture as well as people and process), and the second day will then look at those elements from the standpoint of vertical deployment. The summit, like the market, has been maturing year after year. We have learned a great deal, but also understand there is much left to absorb and much left to do. That said, this seems like an ideal time to take stock of what we have learned about leveraging IoT as we move from pilots to production.