It's not just marketers who have a rose-tinted idea of what an AI strategy entails.

Brian Buntz

June 21, 2019

5 Min Read

Luke Durcan, director of EcoStruxure at Schneider Electric, recalls hearing an executive at an industrial company say something along the lines of: “We want to do some AI. We want to get some AI into our process as quickly as possible.”

“When?” Durcan asked.

“Probably July,” Durcan recalled the unnamed executive saying. “Yes, we want to get some AI by July.”

“You just kind of look at the guy, and then you realize that he really doesn’t understand the process. He really doesn’t understand the underlying mechanisms and requirements to get there,” Durcan said. “And the reality of it is, in an industrial context, there is no such thing as AI. It’s marketing.”

While industrial and data science experts hold a variety of opinions about vague notions such as AI and its connection to almost magical-seeming artificial general intelligence, they agree on the need for stepwise, disciplined data contextualization and deployment of techniques such as analytics and machine learning.

Atif Kureishy, who heads Teradata’s AI and deep learning initiative, frames AI as a suite of supporting techniques including analytics, machine learning and deep learning used to support a business outcome. “When you look at deep learning, for instance, which is a subset of machine learning, it’s applying neural networks, large GPU-based computation and high dimensionality of data to make increasingly accurate predictions,” Kureishy said.

As for which industries have been quickest to embrace such techniques, they are the “usual suspects,” Kureishy said, including consumer tech, financial services and insurance. Retail and telco are part of a group that is next in line, he said. In manufacturing, the automotive sector has been among the quickest to embrace techniques such as machine learning and computer vision, given that industry’s interest in autonomous vehicles.

Durcan said the oil and gas industry is a trailblazer within process industries. “These [oil-and-gas] organizations have been investing in data, infrastructure and technology for many, many years, because it’s driven value for them for many, many years,” he explained. 

Further down the maturity curve in process manufacturing are the consumer packaged goods and the materials, minerals and mining sectors, while a number of discrete manufacturing companies, such as electronics manufacturers, are “pretty highly advanced,” Durcan said.

So, what should lagging industrial companies do to make up for lost ground when it comes to an Industry 4.0, smart factory or AI strategy — or whichever term is preferred? And what should those in the middle of the pack do next?

Start with a self-audit and, if necessary, ensure your organization has a robust data science grounding. Much of Teradata’s work with industrial companies includes “building the foundational aspects that our banking customers, for instance, have been investing in for the past 30 years,” Kureishy said. Many industrial companies find themselves working to understand which kinds of contextual data they have, calibrating sensors and focusing on “data science 101” aspects related to talent, tooling and their environment.

That’s not to pick on industrial companies. Last year, Gartner found more than 87% of organizations — across sectors — have low business intelligence and analytics maturity.

At an early stage, a manufacturer may have instrumented a string of sensors across its operations to better understand the conditions a material is exposed to in the manufacturing process. Once that organization can track its data in context, it can begin to spot anomalies preceding a manufacturing defect that leads to scrap. “That’s not really prediction just yet, but that’s saying: ‘Hey, I can now better characterize what’s happening in that manufacturing process,’” Kureishy said. “Because I have all this telemetry data coming out and I can process and analyze it, and stitch it together, I can characterize better in a quantitative way what went wrong.”
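To make that characterization step concrete, here is a minimal sketch in Python of flagging anomalous sensor readings against their recent history. The sensor names, thresholds and simulated data are illustrative assumptions for this article, not anything drawn from Teradata or Schneider Electric deployments.

```python
# A hedged sketch: flag telemetry readings that drift far from their recent norm,
# the kind of anomaly that may precede a defect. All columns and values are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Simulated line telemetry: temperature and vibration sampled once per minute.
telemetry = pd.DataFrame({
    "timestamp": pd.date_range("2019-06-01", periods=1_000, freq="min"),
    "temperature_c": rng.normal(72.0, 1.5, 1_000),
    "vibration_mm_s": rng.normal(2.0, 0.3, 1_000),
})

# Inject a slow temperature drift of the sort that might precede a defect.
telemetry.loc[800:, "temperature_c"] += np.linspace(0, 6, 200)

# Rolling z-score: how far each reading sits from the past hour of readings.
window = 60
for col in ["temperature_c", "vibration_mm_s"]:
    rolling = telemetry[col].rolling(window)
    telemetry[f"{col}_z"] = (telemetry[col] - rolling.mean()) / rolling.std()

# Flag readings more than three standard deviations from the recent norm.
anomalies = telemetry[
    (telemetry["temperature_c_z"].abs() > 3) | (telemetry["vibration_mm_s_z"].abs() > 3)
]
print(anomalies[["timestamp", "temperature_c", "vibration_mm_s"]].head())
```

This is descriptive rather than predictive, in line with Kureishy’s framing: it quantifies what is happening in the process without yet forecasting what will happen next.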

At an early stage like this and throughout the process, Durcan stressed the importance of focusing on people and process as well as technology. “Within a typical brownfield facility, there are people who have been there for 20, 30 or 40 years who probably know a lot more about the process than you ever will,” he said. “And then there’s the process itself, which again, is an evolution over time. So, you’re going to find ways to integrate your technology into the people–process environment to deliver incremental value.”  

Industrial organizations that have invested in building a solid data science basis can then begin to explore the potential of more-advanced techniques like neural networks. And as their maturity advances, they can move from characterizing what is happening in their operations to correlating the variables that arise in that environment and eventually establishing causation between variables. “That’s saying: ‘When A happens, B occurs, so I know C is going to materialize,’” Kureishy explained. “That gets you in a better stance of prediction. You can start to say: ‘I’m starting to see these anomalies. If you don’t intervene at some level, then I know that this condition C is going to occur.’”
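The “when A happens, B occurs, so C will materialize” step can be illustrated with a simple classifier trained on historical events. The sketch below is a toy example under assumed, simulated indicators; it is not Teradata’s method, only one plausible way to turn correlated leading indicators into a prediction.

```python
# A hedged sketch: once leading indicators A and B have been tied to a downstream
# condition C in historical data, a simple model can predict C from new readings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2_000

# Simulated leading indicators (e.g., a pressure spike and a temperature drift).
a_spike = rng.normal(0, 1, n)
b_drift = rng.normal(0, 1, n)

# Condition C occurs more often when both indicators are elevated.
prob_c = 1 / (1 + np.exp(-(1.5 * a_spike + 1.2 * b_drift - 1.0)))
condition_c = rng.binomial(1, prob_c)

X = np.column_stack([a_spike, b_drift])
X_train, X_test, y_train, y_test = train_test_split(X, condition_c, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")

# Operator-facing question: given current readings, how likely is condition C?
current_readings = np.array([[2.1, 1.8]])
print(f"P(condition C) = {model.predict_proba(current_readings)[0, 1]:.2f}")
```

A prediction like this is what lets an operator intervene before condition C occurs, which is the “better stance of prediction” Kureishy describes.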

The next tier extends the sophistication. “You can make a really distinct recommendation in a prescriptive manner to fix or optimize a process,” Kureishy added. The top tier is where this entire process of spotting anomalies and addressing them before they cause bigger problems is completely automated. “We’re talking Terminator,” Kureishy joked.

Industrial organizations should avoid concluding that their AI strategy journey has a definite end destination. “There’s no nirvana over the hill,” Durcan said. “This just keeps on getting more and more complicated.”

Finally, industry leaders should understand that “integration of data is paramount, but data on its own is just the beginnings of a predictive model and an analytic model,” Durcan added. It is vital that such professionals understand their asset hierarchy, asset model and asset context. “Then you can start to build better, more detailed information about the data flow and data infrastructure around your organization,” Durcan added. From there, they can leverage the data for descriptive visualizations and operational reaction. “That’s what 90% of the people are going to use it for,” he added. “But you have to take the first step on the journey to get there.”
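What Durcan calls asset hierarchy and asset context can be pictured as joining raw readings to a model of where each sensor sits in the plant. The snippet below is a minimal, assumed example (the hierarchy, sensor IDs and values are invented) of contextualizing telemetry so it can feed the descriptive reporting most users rely on.

```python
# A hedged sketch: attach raw sensor readings to an asset hierarchy
# (site -> line -> machine) and produce a simple descriptive rollup.
import pandas as pd

# Asset model: which sensor belongs to which machine, line and site (invented).
assets = pd.DataFrame({
    "sensor_id": ["T-101", "T-102", "V-201"],
    "machine":   ["press_a", "press_b", "mixer_1"],
    "line":      ["line_1", "line_1", "line_2"],
    "site":      ["plant_north"] * 3,
})

# Raw, uncontextualized readings as they might arrive from a data historian.
readings = pd.DataFrame({
    "sensor_id": ["T-101", "T-102", "V-201", "T-101"],
    "value":     [71.8, 73.2, 2.4, 78.9],
})

# Join telemetry to its asset context, then summarize per machine.
contextualized = readings.merge(assets, on="sensor_id", how="left")
summary = contextualized.groupby(["site", "line", "machine"])["value"].agg(["mean", "max"])
print(summary)
```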

About the Author(s)

Brian Buntz

Brian is a veteran journalist with more than ten years’ experience covering an array of technologies including the Internet of Things, 3-D printing, and cybersecurity. Before coming to Penton and later Informa, he served as the editor-in-chief of UBM’s Qmed where he overhauled the brand’s news coverage and helped to grow the site’s traffic volume dramatically. He had previously held managing editor roles on the company’s medical device technology publications including European Medical Device Technology (EMDT) and Medical Device & Diagnostics Industry (MD+DI), and had served as editor-in-chief of Medical Product Manufacturing News (MPMN).

At UBM, Brian also worked closely with the company’s events group on speaker selection and direction and played an important role in cementing famed futurist Ray Kurzweil as a keynote speaker at the 2016 Medical Design & Manufacturing West event in Anaheim. An article of his was also prominently featured on kurzweilai.net, a website dedicated to Kurzweil’s ideas.

Multilingual, Brian has an M.A. degree in German from the University of Oklahoma.
