How design thinking will re-frame IoT analytics prototyping


Data scientists are familiar with prototyping: we do it as part of a well-understood process to arrive at the optimum solution. But prototyping is about more than perfecting the math; it should also test the interpretability of the results.

This need comes into sharp focus when you start thinking about the Internet of Things (IoT): the streams of new data that will flow, and the much larger community of employees, partners and customers who will want analyses of selected parts of that data.

Putting the spotlight on user experience 

We’ve been referring to design thinking as a way to conceptualize the underlying IoT-driven change. Design thinking gives a natural focus for prototyping and the use of big data: user experience. This is helpful when there are many possible areas of focus, as it guides the process towards what really matters.

This is particularly pertinent for big data and analytics, because they often require innovative thinking and exploration of unfamiliar territory. The sheer volume of data can make it difficult to know where to start looking for insights. Design thinking, with its focus on user experience, often brings a new perspective to the table.

The focus on users has another benefit: It’s intrinsically democratic. It breaks down the split between ‘them’, the users, and ‘us’, the experts, by making clear that the user perspective is the most important element in design.

It therefore also has an effect on hierarchy within companies, and helps foster a more creative, perhaps even experimental, atmosphere by making clear that everyone has equal potential to contribute effectively.

Why IoT needs design thinking

In many ways, the Internet of Things and design thinking have a potentially symbiotic relationship. The best way to experiment and prototype is with lots of data. In fact, the more data, the better, as this allows more potential to generate insights.

The actual quality of the data may be less important, at least in the first instance: raw data is perfectly acceptable for experimenting and prototyping. This may be a prime example of quantity trumping quality, although cleaning the data and improving its quality is likely to be necessary later, once beyond the initial prototyping.
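To make this concrete, here is a minimal sketch (the sensor names and record format are illustrative assumptions, not from the source) of prototyping directly on a raw, dirty IoT feed: parse what you can, skip what you can't, and defer cleaning until after the prototype has proved its worth.

```python
# A minimal, illustrative sketch: prototype on raw IoT readings without
# cleaning first, by simply skipping records that fail to parse.
raw_readings = [
    "sensor-1,21.4", "sensor-2,19.8", "sensor-1,ERR",   # one corrupt record
    "sensor-2,20.1", "", "sensor-1,22.0",               # one empty record
]

parsed = []
for line in raw_readings:
    try:
        sensor, value = line.split(",")
        parsed.append((sensor, float(value)))
    except ValueError:
        continue  # tolerate bad data at the prototyping stage

# Enough survives to start exploring, even though the feed is dirty:
# 4 usable readings out of 6.
usable = len(parsed)
peak = max(v for _, v in parsed)
```

The point of the sketch is quantity over quality: a crude skip-on-error rule keeps the experiment moving, and proper data cleaning can come later, once past the initial prototype.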

Experimentation and prototyping requires a tolerance of failure. After all, if you knew something was going to work, you wouldn’t build a prototype in the first place. This tolerance of failure needs to run right through the company, as it’s likely to be a regular feature of any new approach, including design thinking, and therefore requires cultural change.

That said, it’s entirely possible that design thinking will actually make failure less likely, because it gives insight into customer thinking, and therefore into what is likely to succeed.

“Minimum required for success” is more efficient than “perfect.” Part of a tolerance for failure is the need to work swiftly, and not seek perfection at the expense of speed.

As a corollary of the Pareto principle, roughly 80 percent of the result can be achieved with 20 percent of the effort. With prototyping, it is crucial to stop at that 80 percent point, and not waste time on the remaining 20 percent.

The last 20 percent is probably not essential for functioning, and certainly not at the prototyping point. It’s also the enemy of speed. Seeking perfection may mean missed opportunities to try something at the right moment.

Where to start

  1. Define the business question. It’s OK if it changes later, based on evidence and user experience, but it starts the process in a focused manner.
  2. Mine the data. Users probably won’t tell you about issues, so use the data to find both the successes and the problems.
  3. Review, revise and iterate.
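The three steps above can be sketched as a simple analysis loop. Everything here is illustrative (the device names, event records and the 0.5 flagging threshold are assumptions), but it shows the shape of a first prototype: a focused question, a quick pass over the data, and a provisional result to review and revise.

```python
from collections import Counter

# 1. Define the business question: which devices fail most often?
#    (Illustrative event data; in practice this would come from the IoT feed.)
events = [
    {"device": "pump-A", "status": "ok"},
    {"device": "pump-A", "status": "fail"},
    {"device": "pump-B", "status": "ok"},
    {"device": "pump-A", "status": "fail"},
    {"device": "pump-B", "status": "ok"},
]

# 2. Mine the data: compute a failure rate per device.
totals, failures = Counter(), Counter()
for e in events:
    totals[e["device"]] += 1
    if e["status"] == "fail":
        failures[e["device"]] += 1

rates = {d: failures[d] / totals[d] for d in totals}

# 3. Review, revise and iterate: flag devices above a provisional
#    threshold, then refine the question or threshold based on feedback.
THRESHOLD = 0.5  # assumed starting point, expected to change between iterations
flagged = [d for d, r in rates.items() if r > THRESHOLD]
```

Here pump-A fails in 2 of 3 events and gets flagged; the threshold and even the question itself are expected to change on the next iteration, which is exactly the point of step 3.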

You can read a perfect example of how to exploit data coming from the IoT to drive revenue, cut costs and innovate in this IndustryWeek Special Research Report: The IoT: Finding the Path to Value.


About Author

Colin Gray

Colin started his career training to be an actuary and holds a Certificate of Actuarial Techniques. Since moving to SAS, he has concentrated on the detection and prevention of fraud through the use of Analytics.

