As companies start using IoT more and more, edge computing is helping them gain more real-time analytics for faster decision-making.

Daniel O'Shea

March 7, 2019


In enterprise IoT architectures, cloud computing can play a significant role in organizing, storing and analyzing the variety of data collected from IoT sensors. However, as companies start to use IoT for a broader array of purposes, including management of mission-critical or industrial applications, there will be a growing need for real-time analytics and faster decision-making to occur directly on the IoT device, or somewhere closer to the device.

For example, an autonomous car or a critical piece of equipment in a partially automated factory may require instant, almost zero-latency response to detected conditions, as human lives or business livelihoods may hang in the balance. To enable necessary action to be taken more quickly–perhaps autonomously–companies deploying IoT for such purposes increasingly may turn to the concept of edge computing.

Generally, edge computing is the concept of processing data on the connected device, or as close to that device as possible, but the term has no universally agreed-upon definition.

Phil Ressler, CEO of IoT sensor platform company Sixgill, said, “Edge computing is the notion of processing data and conducting analytics on the endpoint device itself, or somewhere else at the edge of the enterprise network, as opposed to sending the data out to a data center.”

David Linthicum, chief cloud strategist at Cisco Systems, shares that view. “By eliminating the distance and time it takes to send data to centralized sources, we can improve the speed and performance of data transport, as well as devices and applications on the edge,” he wrote in a blog post.

Telecom service provider AT&T put a slightly different spin on it in a 2017 white paper, describing edge computing as “placement of processing and storage capabilities near the perimeter (i.e., “edge”) of a provider’s network.” That definition suggests processing occurring not on the enterprise premises, but at a location in the carrier’s network, such as a 5G cell tower close to an enterprise customer, a take that befits AT&T’s mission to serve enterprises as a provider of managed services. (AT&T, according to a Light Reading report, is planning to test enterprise use cases for edge computing later this year. The same outlet reports Verizon recently tested edge computing on its 5G network in Houston and determined that edge computing helped it cut latency in half.)

Sixgill’s Ressler acknowledged, “There’s not a binary definition of where the edge begins and where the edge ends.”

There are other terms floating around in the edge computing stew, such as edge cloud, which is similar to AT&T’s definition; fog computing, which was spun by Cisco as a way of describing a standardized model for edge computing; and mist computing, the new-age-sounding notion that puts processing in between the cloud and edge — within a sensor, for instance.

If all of this sounds more like the decades-old concept of distributed computing — with Ethernet and ARPANET being prominent examples — there’s a reason for that. Edge computing is one of the latest ways of applying distributed computing philosophy and practice, according to Eric Simone, CEO of ClearBlade, a company that has been talking up the benefits of edge computing for several years.

Simone, who described his definition of edge computing as more restrictive than most, said the term should apply only to processing happening “on the device that generated the data, or that needs that data.” Presumably, data processing that takes place at a nearby gateway or edge device would be classified as near-edge, according to Simone.

“My definition of edge computing is if you have a solution that runs on the cloud or elsewhere, and then you can take that solution and sync it completely with a device in the car, on a train, in a factory, and run it there,” he added. “That means standardization of the way you transmit not just data, but the entire system.”

That description edges — pun intended — in the direction of the concept of firmware, the software that enables basic hardware operations on a wide variety of devices. He said edge computing, however, is different, having more to do with the processing of data to run application-level solutions. Evolving device firmware, computing power and memory are, though, complementary to the practice of edge computing. Ultimately, he acknowledged that concepts such as edge cloud or “near edge” computing, firmware and others all deliver their own value — they just aren’t edge computing.

Why Edge Computing Is Hot

While the definition of edge computing may be fluid, most agree on the benefits it provides within enterprise IoT deployments, and why it makes sense for those IoT architectures to move in this direction. Gartner anticipates that by 2025, 75 percent of data processing will move to the edge — up from 10 percent in 2018.

It remains to be seen whether edge computing adoption will grow that quickly, but the technology’s most important benefit is the ability to make decisions in real time. “You don’t have any latency factors other than those on the device itself,” said Sixgill’s Ressler, who added that the escalating movement toward mission-critical and industrial IoT applications requires operational decisions to be made on the sensors themselves “in a sub-second or millisecond time frame.”

While 5G mobile broadband connectivity increasingly is being positioned to help IoT devices achieve latency of no more than 1–2 milliseconds between devices and clouds, it still carries the potential for a connectivity failure at a critical moment.

Similar thinking applies to the benefits of edge computing in an IoT environment where the devices are remotely located or distributed over broad geographies. This describes everything from construction sites to autonomous vehicles to railroads to mining and other transportation and logistics fleets.

“There’s a benefit if the device is remote and you’re able to deploy the data application functionality on the same device that is collecting the data,” Ressler said. While connectivity is very important in these use cases, he said edge computing on the device itself means that if that connectivity is intermittent, the IoT-connected device “can operate on its last known information state,” to enable key decisions to be made.
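As a rough illustration of the fallback pattern Ressler describes — the class, method names and threshold below are invented for this sketch, not drawn from any vendor’s API — an edge device can cache its last successful sync and keep making local decisions when the uplink drops:

```python
class EdgeSensorController:
    """Sketch of an edge device that keeps operating on its last known
    information state when connectivity is intermittent (all names and
    thresholds here are illustrative)."""

    def __init__(self):
        self.last_known_state = None  # cached result of the last good sync

    def read_remote_config(self):
        # Stand-in for a cloud call that can fail at any moment.
        raise ConnectionError("uplink unavailable")

    def decide(self, sensor_reading):
        try:
            state = self.read_remote_config()
            self.last_known_state = state   # refresh cache on success
        except ConnectionError:
            state = self.last_known_state   # fall back to last known state
        # The decision itself is made locally, so latency is unaffected
        # by whether the cloud was reachable.
        return "shut_down" if sensor_reading > 100 else "normal"
```

The key point is that the decision path never blocks on the network; connectivity only refreshes the cached context.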

Gary Brown, AI product marketing director at Intel, said the ability to conserve overall power may improve through the use of edge computing, as transferring data to cloud servers or edge cloud servers can require more power consumption than keeping it on devices. “Depending on the amount of computing you have to do, you can do it in the device,” he said. “That translates to faster results, lower latency and lower power consumption.”

AI and the Computing Continuum

Edge computing can help companies migrate to a future in which some latency-sensitive IoT applications, such as those involving autonomous vehicles or management of critical infrastructure, more often will require instantaneous decision-making. It’s part of what will help IoT deliver on all the hype it has been saddled with.

“IoT can help us realize the true value of the Internet,” ClearBlade’s Simone said. “What we have done the last 25 years or so has been great [with social media, e-commerce and customer relationship management innovations], but it’s low-hanging fruit. What’s going to be a big deal is the next 25 years, when we see trains, cars, manufacturing systems and critical infrastructure get updated and modernized because of IoT.”

Edge computing is just one piece of the puzzle, however. Artificial intelligence is another, enabling the detection and recognition of patterns or changing circumstances to be processed at the edge. AI silicon could be used in endpoint devices or in edge nodes, such as gateways, to allow AI inference to help devices–either autonomously or through their human operators–use data-driven insights to take the right course of action in managing an application.

One example of how this would work, Brown said, is an automated retail store operation, in which cameras and other sensors track availability of inventory, shopper handling of inventory, and a cashier-less or mobile-enabled checkout process. AI can be used in the edge devices to make sure that an item is added to the customer’s bill when placed in a shopping cart, or deleted if it is removed from the cart. Detection of low inventory can trigger reordering or restocking.
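The billing logic behind such a cashier-less flow can be sketched simply — here the on-device vision model is assumed to emit hypothetical `(action, item, price)` events, a format invented for this example:

```python
def update_bill(bill, event):
    """Apply a detected cart event to a running bill.

    `bill` maps item name to accumulated charge; `event` is a
    hypothetical (action, item, price) tuple emitted by an edge
    vision model when it sees an item added to or removed from
    a shopping cart.
    """
    action, item, price = event
    if action == "added":
        bill[item] = bill.get(item, 0) + price
    elif action == "removed" and item in bill:
        bill[item] -= price
        if bill[item] <= 0:
            del bill[item]   # fully refunded, drop the line item
    return bill
```

Because the model and the billing state both live at the edge, the customer’s total stays correct even during a brief loss of backhaul connectivity.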

In a different kind of example, the edge devices could be connected cameras in a smart city that identify the amount of traffic congestion at an intersection. These devices could be connected to nearby appliances that aggregate the camera feeds and run video analytics across multiple cameras. Those appliances, in turn, are connected to the network, where intelligence can be aggregated and applied to controlling traffic signals to better manage congestion throughout the city.
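The aggregation step at such an appliance might look like the following sketch, where per-camera vehicle counts (the output of an assumed upstream video-analytics stage) are combined into a single classification; the threshold is invented:

```python
def congestion_level(camera_counts, threshold=20):
    """Classify an intersection from per-camera vehicle counts.

    `camera_counts` maps camera ID to the number of vehicles that
    camera's (hypothetical) video analytics currently detects; the
    appliance sums across cameras before deciding.
    """
    total = sum(camera_counts.values())
    return "congested" if total >= threshold else "clear"
```

Only this small classification result, rather than raw video, would then travel upstream to the citywide traffic-signal controller.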

“You need to have a powerful neural network that can do all of the detection and classification in that environment,” Intel’s Brown said. “The data center has always had a lot of AI, but now you see it being used at the edge. AI at the edge now may be growing faster than AI at the data center [according to data Brown attributed to Intel, IDC and Gartner].”

That doesn’t mean that the need for cloud computing in enterprise IoT networks will dissipate. Brown and Ressler both referred to “the continuum of computing resources” that allows companies to make adjustments in how they manage their computing needs.

“If the edge device has enough memory, you can process and archive data there, and it can be uploaded to the cloud later when it’s convenient,” Ressler said. “You also could use edge computing to make decisions on the device in real time, and also be sending some data to the cloud to be analyzed, but not relying on the cloud for any real-time decisions.” That strategy can help organizations reduce their cloud bill, and also can be effective in use cases where intermittent connectivity is an issue, such as at construction sites or mining sites.
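Ressler’s decide-locally, upload-later pattern can be sketched as a small buffer — the class, field names and 90-degree threshold are invented here, and the cloud uploader is a stand-in callable rather than a real service client:

```python
import json
import queue

class EdgeBuffer:
    """Sketch: make real-time decisions on-device, archive readings
    locally, and drain the archive to the cloud when convenient."""

    def __init__(self, maxsize=1000):
        self.archive = queue.Queue(maxsize=maxsize)

    def handle(self, reading):
        # The real-time decision never waits on the cloud.
        decision = "alert" if reading["temp_c"] > 90 else "ok"
        if not self.archive.full():
            self.archive.put(reading)  # keep for later cloud analytics
        return decision

    def flush_to_cloud(self, send):
        """Drain archived readings through `send`, a caller-supplied
        uploader, and return how many were shipped."""
        sent = 0
        while not self.archive.empty():
            send(json.dumps(self.archive.get()))
            sent += 1
        return sent
```

Batching uploads this way is what lets the device ride out intermittent connectivity while still feeding the cloud for non-real-time analysis.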

Brown added, “It’s not a ‘versus’ situation.” Cloud computing and edge computing are complementary. “The reality is the amount of processing that people want to do is going up and up and up. You need that continuum of processing power.”
