Experts Cautious About 5G’s Impact on IoT Data Architectures

While 5G promises to change everything, experts urge caution about its implications on IoT data architectures.

Crystal Bedell

December 6, 2019


Fifth-generation cellular wireless, or 5G, promises to change everything — most notably, the Internet of Things. While organizations are buying into the hype, experts are a bit more cautious about 5G’s implications on IoT data architectures. 

While individual results will vary based on a host of factors, the general consensus is that 5G will significantly increase download speeds — up to 10 gigabits per second — and decrease latency. “5G reduces latency [to] almost a tenth, so that’s what gets you into the 4- or 5-millisecond range over the air,” said Arpit Joshipura, general manager for networking, IoT, and edge computing at the Linux Foundation.

A 5G use case and adoption survey by Gartner Inc. indicates that organizations are eager to realize 5G’s performance improvements, with 66% of organizations planning to deploy 5G by 2020. IoT is the most popular target use case for 5G, with 59% of the organizations surveyed expecting 5G-capable networks to be widely used for this purpose.

“5G is uniquely positioned to deliver a high density of connected endpoints — up to 1 million sensors per square kilometer,” said Sylvain Fabre, senior research director, Gartner.

According to Joshipura, “We will see the mass deployment somewhere in the 2020 time frame. Two important use cases are industrial and enterprise services. Those are all going to be in dense city areas.”

Re-Evaluating the Edge Versus the Cloud

In the meantime, the ability to move more data, faster, raises a number of architectural considerations.

“Edge computing is more valuable where you have limited connectivity because you’re shifting the analytics process out to the end to save bandwidth and make it more efficient. If you have limitless bandwidth, you could argue that you won’t bother with edge computing because everything would stream to the center,” said Zak Doffman, founder and chief executive officer, Digital Barriers. “Had 5G been around five years ago, there wouldn’t have been a push to go to the edge. But we’ve put AI in hardware at the edge, and smart sensors are able to pull intelligence from the data before it’s sent anywhere.”

Paul Bevan, research director, IT infrastructure, Bloor Research, asserts that there is still value in performing analytics at the edge. “The impact of 5G in terms of analytics has been overplayed,” he said, regarding the claim that processing will be almost eliminated from the edge. “That’s not going to happen for too many reasons: performance, latency, security. The reality is that the automation of the IT infrastructure and the ability to put completely dark operations at the edge will make it easier for organizations to think about having completely automated operations at the edge and therefore do more processing there.”

Joshipura agreed. “As a radio technology, 5G can carry a lot more data, which means if you can process all your data locally then you don’t need to carry it back to a centralized data center in the cloud. I would say 5G allows you to move the entire concept of a cloud from centralized data center to distributed edges. You are less dependent on the centralization because you’re effectively doing locally distributed processing.” 

Taking It Case by Case

Regardless, Doffman said, “There will be far more thought given to where you put your AI, which is of major value in IoT.”

That determination will not be focused solely on cost. “The issue about where you process the analytics or where you process the information is much more around two areas. First, it’s a latency issue. How quickly do you need the sensor piece to work? Second, how much data is actually needed back at the center for broader analytics?” Bevan said.

He continued: “When talking about longer scale projections, a lot of what we see is that a local edge-based environment needs the analytics and needs sensors to respond, and then you go to the cloud if you need a level of machine learning to get better at the automation. It’s a data architecture that will drive how much goes backwards and forwards, and what a business needs in terms of latency.”
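The two factors Bevan describes — the latency budget of the sensor loop and how much raw data the central site actually needs — can be captured as a simple placement policy. The following is an illustrative sketch, not anything from the article: the workload names, the cloud round-trip figure, and the 20% data-fraction cutoff are all hypothetical assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    latency_budget_ms: float      # how quickly the sensor loop must respond
    central_data_fraction: float  # share of raw data needed centrally (0..1)

# Hypothetical round trip to a centralized cloud: the ~4-5 ms over-the-air
# figure cited in the article plus assumed backhaul and processing time.
CLOUD_ROUND_TRIP_MS = 40.0

def placement(w: Workload) -> str:
    """Return 'edge' or 'cloud' for a workload.

    Runs at the edge when the cloud round trip would blow the latency
    budget, or when most of the raw data never needs to leave the site.
    """
    if w.latency_budget_ms < CLOUD_ROUND_TRIP_MS:
        return "edge"
    if w.central_data_fraction < 0.2:  # hypothetical cutoff
        return "edge"
    return "cloud"

print(placement(Workload("robot-arm-control", 10, 0.05)))   # edge
print(placement(Workload("fleet-trend-report", 500, 0.9)))  # cloud
```

In this toy model, a tight control loop stays at the edge regardless of bandwidth, while slow, center-hungry analytics go to the cloud — mirroring Bevan's point that the data architecture, not raw connectivity, drives what "goes backwards and forwards."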

The Bottom Line 

All that said, questions of cost still arise. “Large operators have started hinting at tiered pricing based on bandwidth, latency and SLAs in terms of service time,” Joshipura said. “That would mean additional revenue for telcos, and from an enterprise perspective, it would move some workloads and apps from a centralized cloud to a distributed cloud.”

The result? “For enterprises, the cost might be a wash, but the services and the experience will improve for the same application. As enterprises come up with more applications, it will cost them more money,” Joshipura predicted. “There will be a top-line increase from an end user perspective, while distributing OPEX and CAPEX below it.” 

About the Author(s)

Crystal Bedell

Crystal Bedell is a freelance tech writer and B2B content marketing consultant with nearly two decades of experience. She works with technology solution providers, managed services providers, publishing companies and agencies specializing in B2B tech.
