Making sense of streaming data in the Internet of Things

by Daniel Teachey, Insights Editor

The Internet of Things (IoT) is evolving from a term familiar only to die-hard technology wonks into a reality for consumers. From wearable fitness trackers to connected appliances, we are buying and using devices that spit out data that gets collected. Somewhere.

Now, the things that seemed like science fiction 20 years ago (Self-driving cars? Refrigerators that send a text when you’re out of milk?) are rapidly approaching the market. The sheer number of “things” in the Internet of Things is impressive; Inc. magazine estimates that more than 26 billion devices will be connected by 2020.

No, IoT isn’t merely a buzzword. It’s where we’re headed.

But, first, let’s examine what all this means. Everyone is focused on what people can “do” with this data, but that focus leaves out a significant step. Before you do something, you first have to decide what to do and when to do it.

This is the essence of analytics. And today’s device-driven world is forcing analytics to occur as fast as the data is generated.

In a new white paper, Frédéric Combaneyre, a business solutions manager at SAS, outlines how you can use event stream processing (ESP) to make sense of the data streaming from the Internet of Things (IoT). More than a collision of acronyms, this is one emerging technology evolving to solve the problems created by another.

The excerpt below outlines the origins of IoT – and where the data that will serve as the foundation for streaming analytics is coming from.

The early world of sensors

The first sensors appeared decades ago, but these devices entered the popular nomenclature only recently, thanks to the Internet of Things. A sensor detects events – or changes in quantities – and provides a corresponding output, generally as an electrical or optical signal.

Today, sensors are used in everyday objects such as touch-sensitive elevator buttons and lamps that dim or brighten by touching the base. Sensors are also heavily used in manufacturing, medicine, robotics, cars, airplanes and aerospace.

The largest sensor challenges occur once signals have been detected. At that point, you have to decide:

  • Where do I collect the data being generated?
  • How can I use it?

To capture and collect the signals coming from sensors, the operational historian emerged: a database application that logs and stores historical, time-based data flowing from sensors. These data stores are optimized for time-dependent analysis and are designed to answer questions such as, “What was today’s standard deviation from hourly unit production?”
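
To make that concrete, here is a rough sketch (not from the white paper) of how such a time-based question might be answered in Python with pandas, assuming a hypothetical unit_production.csv export of timestamped unit counts and a made-up date:

    import pandas as pd

    # Hypothetical historian export: one row per timestamped unit count.
    readings = pd.read_csv("unit_production.csv",
                           parse_dates=["timestamp"], index_col="timestamp")

    # Roll the raw readings up into hourly unit production for one day.
    hourly_units = readings.loc["2015-06-01", "units"].resample("1H").sum()

    # "What was today's standard deviation from hourly unit production?"
    print(hourly_units.std())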

Historian technology captures data from sensors and other real-time systems; it often builds on manufacturing standards and provides interfaces for several hundred sensor types. These dedicated data historians are also designed to survive harsh conditions, such as a production floor, and they can continue capturing and storing data locally even if the main data store is unavailable.
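
That last capability is essentially store-and-forward buffering. Here is a minimal sketch of the idea in Python, assuming a hypothetical main-store client object with a write() method:

    import queue

    # Minimal store-and-forward sketch: every reading is captured locally
    # first, then flushed to the main data store when it is reachable.
    local_buffer = queue.Queue()

    def record(reading):
        local_buffer.put(reading)  # capture survives a main-store outage

    def flush(store):
        while not local_buffer.empty():
            reading = local_buffer.get()
            try:
                store.write(reading)       # hypothetical main-store client
            except ConnectionError:
                local_buffer.put(reading)  # still down: keep buffering
                break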

Historian software has gained complementary tools that provide reporting and monitoring features to detect trends or correlations that indicate a problem. These tools can then alert an operator to take immediate action before equipment damage occurs.
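
One common form of such monitoring is comparing each new reading against a rolling baseline and alerting when it drifts too far. A minimal sketch, with made-up window and tolerance values:

    from collections import deque

    WINDOW, TOLERANCE = 60, 5.0    # made-up monitoring parameters
    recent = deque(maxlen=WINDOW)  # rolling window of recent readings

    def on_reading(value, alert):
        recent.append(value)
        baseline = sum(recent) / len(recent)
        if abs(value - baseline) > TOLERANCE:
            alert(f"Reading {value} deviates from baseline {baseline:.1f}")

    for value in [70.0, 71.2, 69.8, 70.5, 88.0]:  # made-up readings
        on_reading(value, print)                  # alert callback is print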

Until recently, that was the state of the art for generating value out of sensor data.

The Internet of Things – and big data explosion

Since 2012, two major changes have shaken the sensor world – and caused the IoT market to mature more rapidly than before:

  • Sensors shrank. Technological improvements created microscopic-scale sensors, leading to the use of technologies like microelectromechanical systems (MEMS). This meant that sensors were now small enough to be embedded in unique places like clothing or other materials.
  • Communications improved. Wireless connectivity and communication technologies have improved to the point that nearly every type of electronic equipment can provide wireless data connectivity. This has allowed sensors, embedded in connected devices, to send and receive data over the network.

Whether it’s a car or a pacemaker, the data from sensors is flowing in a constant stream from the device to the network – and sometimes back to the device. This is leading to massive amounts of data, and as a result, IoT is seen as a major contributor to the era of big data.

While organizations today are investing heavily in capturing and storing as much data as possible, the critical challenge is using this data while it is still in motion – and extracting valuable information from it. Organizations are (or will soon be) scrambling to apply analytics to these streams of data before the data is stored for post-event analysis. Why? Because you need to detect patterns and anomalies while they are occurring – in motion – in order to have a considerable impact on the event outcome.
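
As an illustration of what analytics in motion can look like at its simplest (a sketch, not SAS’s ESP implementation), the snippet below scores each event as it arrives, using a running mean and variance (Welford’s algorithm), and flags three-sigma outliers before anything reaches storage:

    import math

    class StreamScorer:
        """Flags events more than three standard deviations from the running mean."""

        def __init__(self, warmup=5):
            self.n, self.mean, self.m2, self.warmup = 0, 0.0, 0.0, warmup

        def is_anomaly(self, x):
            std = math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0
            anomaly = (self.n >= self.warmup and std > 0
                       and abs(x - self.mean) > 3 * std)
            # Update the running stats afterward (Welford's algorithm),
            # so an outlier cannot mask itself.
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)
            return anomaly

    scorer = StreamScorer()
    for event in [10.2, 10.1, 9.9, 10.0, 10.3, 42.0]:  # made-up readings
        if scorer.is_anomaly(event):
            print("anomaly detected in motion:", event)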

To learn more about how event stream processing can make a difference for your IoT initiatives, view the white paper “Understanding Data Streams in IoT.”


