IoT Sensing and the Importance of a Time-Series Database
The chief technology officers from InfluxData and Worldsensing discuss their role in monitoring the Ponte Vecchio bridge, the evolution of IoT and other topics.
November 7, 2019
When a sinkhole opened up near the Ponte Vecchio bridge in Florence, Italy in 2016, authorities worried the famed structure could be at risk. The industrial IoT company Worldsensing stepped in, enlisting InfluxData's help to monitor the bridge. Working in partnership with the University of Florence and the Municipality of Florence, Worldsensing used wireless sensors to determine the bridge was safe, while on the backend it used a time-series database from InfluxData to look for anomalies in the sensor data.
The Barcelona-headquartered company Worldsensing began using InfluxDB, an open-source time-series database, for an unrelated application — to help monitor the performance of its server infrastructure. After its technical staff deemed it a fit for their internal requirements, the scope of applications grew. “In the end, our customers also needed tools to analyze their time-series data,” said Albert Zaragoza, chief technology officer at Worldsensing, in an interview at IoT Solutions World Congress.
While the leaders of InfluxData don’t tend to be directly involved with IIoT projects, they have long seen the utility of their time-series database for IoT applications. “One of the key insights I had when I started [InfluxData] was: ‘Time series is a useful abstraction for solving problems in sensor data, industrial IoT, consumer-grade IoT, server monitoring, application performance monitoring, real-time analytics and all of these different things,’” said Paul Dix, chief technology officer at InfluxData.
The partnership between Worldsensing and InfluxData represents a typical pattern for industrial IoT applications. Because of the sheer complexity, IIoT vendors frequently must draw from a variety of hardware and software tools to help clients address a given problem.
By contrast, InfluxData focuses on building software tools for developers.
In the following Q&A, we explore the partnership between the two companies, touching on their role in helping secure the Ponte Vecchio bridge, the current state of the IIoT market, and the current shortage of data science and machine learning experts.
Albert, can you tell me about your company’s core focus and how your company got involved in the Ponte Vecchio project?
Albert: Basically, we help digitize mines, industrial infrastructures and cities. In the Ponte Vecchio example, they wanted to make sure that the bridge didn’t fall on account of the sinkhole.
We provided geological sensors and IoT data loggers connected to a gateway that ingests data from those sensors. The system sends the sensor data to an office, where geotechnical experts analyze it.
In some of the use cases, we use Influx to analyze time-series events. In IoT, many events are time-series related. Depending on the use case and the customer, sensors might send data, say, every 30 seconds or once per minute. In the end, it’s all about logging time-series data. We partnered with InfluxData to store that time-series data and then help customers analyze it.
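For illustration, a reading like the ones Zaragoza describes could be encoded in InfluxDB's line protocol before being written to the database. This is a minimal sketch; the measurement, tag and field names below are hypothetical, not Worldsensing's actual schema:

```python
import time

def to_line_protocol(measurement, tags, fields, ts_ns=None):
    """Format one sensor reading as an InfluxDB line-protocol string:
    measurement,tag=... field=... timestamp_ns"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    ts_ns = ts_ns if ts_ns is not None else time.time_ns()
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

# A hypothetical tiltmeter reading from a bridge-monitoring node
line = to_line_protocol("tilt",
                        {"site": "ponte_vecchio", "node": "n42"},
                        {"angle_deg": 0.013},
                        ts_ns=1572000000000000000)
# -> 'tilt,node=n42,site=ponte_vecchio angle_deg=0.013 1572000000000000000'
```

Lines in this format can then be batched and written to the database over HTTP, leaving the query side free to scan readings by time range.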
Paul, could you tell me more about your company’s role in IoT applications?
Paul: Most of the customers who come to us are developers. They kind of already know the things they want to do.
I define the platform as basically four key areas of interest:
How do you collect the data?
How do you store the data so you can query it and process it?
How do you process it for either data enrichment or monitoring and alerting?
And then, finally, how do you visualize it so that you can summarize it for consumption?
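The four concerns above can be sketched as a toy in-memory store. This is only an illustration of collect, store/query, and process, not how InfluxDB is implemented, and all names are invented:

```python
from statistics import mean

class TinyTSDB:
    """Toy in-memory time-series store illustrating the four concerns."""

    def __init__(self):
        self.points = []                       # collect: (timestamp_s, value)

    def write(self, ts, value):                # store a single point
        self.points.append((ts, value))

    def query(self, start, end):               # query a half-open time range
        return [(t, v) for t, v in self.points if start <= t < end]

    def downsample(self, start, end, window):  # process: mean per window
        buckets = {}
        for t, v in self.query(start, end):
            buckets.setdefault((t - start) // window, []).append(v)
        return [mean(vs) for _, vs in sorted(buckets.items())]

db = TinyTSDB()
for t, v in [(0, 1.0), (10, 3.0), (30, 5.0), (40, 7.0)]:
    db.write(t, v)
print(db.downsample(0, 60, 30))   # -> [2.0, 6.0]
```

The downsampled averages are the kind of summarized series a dashboard would then visualize, closing the loop on the fourth concern.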
The customers in the IoT space are largely software companies building something for IoT customers. Siemens is a customer. We partnered with PTC ThingWorx. Tesla is a customer. They use it for tracking sensor data in the Powerwalls, for instance.
As far as the tools for analyzing data, that’s probably the next evolution of our platform. Right now, we provide visualization tools. And we offer a query language that allows people to do basic statistical analysis. The next evolution of the platform will be providing automated tooling around correlation, machine learning, anomaly detection, predictive analytics and those kinds of things. But again, because we’re a platform, anytime we look to do something, we look for something that’s generally applicable to sensor data — basically any vertical within IoT, or server monitoring or real-time analytics.
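Anomaly detection of the kind Dix describes is often bootstrapped with a simple statistical baseline such as a z-score test over a window of readings. The sketch below shows that generic technique under assumed data, not InfluxData's actual tooling:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=3.0):
    """Flag indices of points more than `threshold` standard
    deviations away from the window mean."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values)
            if sigma > 0 and abs(v - mu) / sigma > threshold]

# Hypothetical temperature readings with one obvious spike
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 35.0, 20.1, 19.8]
print(zscore_anomalies(readings, threshold=2.0))   # -> [5]
```

A production system would run such checks continuously over incoming windows and feed flagged indices into alerting, which is where a query/processing language on top of the database earns its keep.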
Albert, Worldsensing has been in the industrial IoT market for more than a decade now. What kind of evolution have you seen in the past 10 years?
Albert: There’s a lot more maturity in IoT. We’ve been living with it since 2008, when the term “IoT” wasn’t common. What we were trying to do was digitize signals nobody else was trying to digitize at the time. And the industries we work with now were operating in a very manual way. They were taking readings from the sensors by hand.
Things have evolved steadily. Right now, we have better sensors, cheaper sensors and bigger batteries. On the edge, you have much more computing power. What’s happening now is, across all industries, people are digitizing. So the focus in IoT right now is not on the digitizing layer anymore. It depends on the vertical, but, for example, pretty much all cities have digitized already. The question becomes: Once you have that data, on a server or somewhere else, what do you do with it?
Paul, on a related point, could you share more about how the InfluxDB time-series database helps data scientists and other similar roles facilitate the process of data analysis?
Paul: One thing I frequently say about data scientists and machine learning experts is they actually spend very little of their time, day to day, doing actual data science and machine learning. The vast majority of their time is eaten up doing infrastructure kind of work, like moving data from place to place, scrubbing data and cleaning it, and all that other stuff. So our focus generally will be to build more and more tooling to make that 90% of their work easier. The goal is to free them up to spend more time focusing on the pure data science and pure machine learning.
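The scrubbing work Dix mentions often reduces to a few mechanical steps. Here is a minimal sketch under assumed conventions (drop missing values, let later writes win on duplicate timestamps, sort chronologically); it is not any particular product's pipeline:

```python
def clean_series(points):
    """Typical scrubbing of raw (timestamp, value) pairs:
    drop missing values, deduplicate timestamps (keep the
    last write), and sort chronologically."""
    latest = {}
    for ts, value in points:
        if value is not None:
            latest[ts] = value        # later writes win
    return sorted(latest.items())

# Raw feed: out of order, one missing value, one duplicate timestamp
raw = [(3, 5.0), (1, 2.0), (2, None), (1, 2.5)]
print(clean_series(raw))   # -> [(1, 2.5), (3, 5.0)]
```

Automating steps like these in the platform is exactly the "90% of their work" the tooling aims to absorb.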
Given the shortage of data science experts, do you see developers stepping in to help fill the void?
Paul: In terms of developers picking up those skills, I would imagine we’ll see more and more of that over time. But generally, the most successful data scientists I know actually come from a science background and then start to move into the developer side of things. It’s less frequent that you see people from pure computer science move on to become pure data scientists.