An executive’s guide to cognitive computing
By Alison Bolen, SAS Insights Editor
From self-driving cars to personal assistants, we’ve seen that machines can already read, write, speak, see, hear and learn. But the big question in cognitive computing is: Can they understand? For a machine to be truly intelligent, it’s not enough for it simply to know the words you’ve said. It needs to know what you want and be able to provide assistance in context.
“Almost every customer I’ve talked to this year wants a briefing on cognitive computing,” says Oliver Schabenberger, SAS Vice President of Analytic Server R&D. “You read articles that tell you we’re going to go from facial recognition to the machines taking over, but really that’s not the case.”
Instead, he says, cognitive computing includes a broad group of technologies like artificial intelligence, machine learning and natural language processing software that are converging to provide assistance to businesses and individuals. Schabenberger provides this definition:
Cognitive computing is based on self-learning systems that use machine-learning techniques to perform specific, humanlike tasks in an intelligent way.
That’s concise, but it includes a few key points that we should explore a bit further:
- Self-learning means the system receives initial instructions, but after that it largely learns on its own, based on the data you continue to feed it.
- Machine-learning techniques automate model building to iteratively learn from data and to find hidden insights without being explicitly programmed where to look.
- Specific, humanlike tasks means the system can classify and understand objects and recognize human languages, but the tasks it performs are highly specialized. A system that is designed to drive your car cannot change your oil or clean your garage.
- In an intelligent way describes how the system is able not only to understand input such as text, voice or video, but also to reason and create output consumable by humans.
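The "machine-learning techniques" bullet above can be made concrete with a toy sketch: a perceptron that learns to separate two classes purely from labeled examples, with no hand-coded rules about where the boundary lies. This is an illustrative miniature, not any vendor's actual implementation.

```python
# Toy illustration of learning from data "without being explicitly
# programmed where to look": a perceptron adjusts its weights from
# labeled examples alone. (Illustrative sketch only.)

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Iteratively nudge weights toward correct predictions."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                      # 0 when the guess is right
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(model, x):
    w, b = model
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Two clusters of 2-D points, labeled 0 and 1.
data = [(1, 1), (2, 1), (1, 2), (6, 5), (7, 6), (6, 7)]
labels = [0, 0, 0, 1, 1, 1]
model = train_perceptron(data, labels)
print(predict(model, (1.5, 1.5)), predict(model, (6.5, 6.0)))  # -> 0 1
```

The point of the sketch is the loop: nobody tells the system which coordinate matters; the weights converge from the data itself.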
Because they can be programmed to learn and solve problems, cognitive computing systems are disruptive to many industries, including legal, health care, financial services, marketing and customer intelligence, says Saratendu Sethi, Senior Director of Advanced Analytics R&D at SAS.
Cognitive computing examples
Sethi describes a cognitive analytics application in a health care setting: Imagine you walk into the emergency room with red eyes and a fever. Cognitive systems in a triage room can analyze your vitals, correlate them with your medical and travel histories, and predict with accuracy whether you have the common flu, the Zika virus or some other illness.
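The triage idea can be sketched as an evidence-scoring step: each observed signal (vitals, travel history) adds weight to candidate conditions, and the system ranks them. The conditions, signals and weights below are invented purely for illustration; a real clinical system would be trained on medical data and validated accordingly.

```python
# Hypothetical triage sketch: combine observed signals into
# per-condition scores. All names and weights here are made up
# for illustration -- this is not a medical model.

CONDITION_WEIGHTS = {
    "common flu": {"fever": 2.0, "red_eyes": 0.5, "travel_tropics": 0.0},
    "Zika virus": {"fever": 1.0, "red_eyes": 2.0, "travel_tropics": 3.0},
}

def rank_conditions(observed):
    """Score each condition by summing the weights of observed signals."""
    scores = {
        name: sum(w for sig, w in weights.items() if sig in observed)
        for name, weights in CONDITION_WEIGHTS.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Patient with red eyes, fever, and recent travel to an affected region.
print(rank_conditions({"fever", "red_eyes", "travel_tropics"}))
```

The same patient without the travel history would score highest for flu, which is the "correlate vitals with history" step in miniature.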
As this health care example illustrates, cognitive technologies are able to understand the world around us, read signs and understand what’s happening – but in a highly focused context to complete a narrow but important task.
“The goal of many cognitive systems is to provide assistance to humans without human assistance,” says Schabenberger. “But it is important to think about who is being assisted by automated systems.” In the health care example above, the doctor and nurse are being assisted as much as the patient.
Likewise, you might imagine robots completing customer service calls, but Schabenberger says it is more likely that existing customer service representatives would be provided with intelligence from a cognitive computing application that they can then use to improve their offers and service to the customers they are assisting. So, in this case, it’s the person in the call center who’s being assisted. And ultimately the customer gets better assistance too.
We have many steps to take before machines can provide reliable assistance to humans without human intervention, but progress is underway. Cognitive systems are already quietly working behind the scenes of many applications. For example, every Google search or Siri interaction is supported by machine learning and cognitive technologies.
How cognitive computing affects jobs
“Automation through cognitive computing will affect every industry, but most periods of industrialization have led to more workers being employed in more valuable positions, not to a net loss of jobs,” explains Schabenberger.
In the legal profession, cognitive computing is already being used to comb through case files and quickly find the important ones, a process that previously took weeks or months. But after that work is complete, lawyers and legal assistants are still needed in the courtroom and during legal proceedings.
“Before, we went through periods of industrial automation,” says Schabenberger. “Cognitive computing is about knowledge automation. In the past, our technologies replaced brawn. Now they’re replacing brain.”
How to get the most out of cognitive computing
In many ways, cognitive computing is a natural extension of existing analytics projects. The challenge for business leaders will be to look for areas where cognitive computing can be applied to business problems.
To find areas in your organization that could benefit from cognitive computing, consider where you have a lot of data, where you might need more automated decisions, or where you might need more personalized interactions with fewer business rules. As the examples above illustrate, the biggest gains may come from assisting your employees, not your customers. Where do you have activities and systems that can be automated or simplified using data?
"The potential use cases for cognitive systems are as wide, varied and rich as the imagination," said Jessica Goepfert, Program Director for Customer Insights and Analysis at IDC. "Wherever cognitive systems are in play, workers and organizations can expect to be impacted by the power of more information, intelligence and automation."1
These newer and faster capabilities to process text, speech and images essentially provide more data sources for broadening analytics projects with a cognitive component. “Face recognition, text recognition and image recognition are all input for analytics applications,” explains Schabenberger.
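The "text as input for analytics" idea can be sketched in a few lines: unstructured text is converted into a structured count vector that a downstream analytics application can consume. Real systems use far richer NLP pipelines, but the principle is the same; the vocabulary and sample ticket below are invented for illustration.

```python
# Minimal sketch of turning unstructured text into structured input
# for downstream analytics: a bag-of-words count vector.

from collections import Counter

def bag_of_words(text, vocabulary):
    """Count occurrences of each vocabulary word in the text."""
    counts = Counter(text.lower().split())
    return [counts[word] for word in vocabulary]

vocab = ["delay", "refund", "thanks", "cancel"]
ticket = "Flight delay again, I want a refund for the delay"
print(bag_of_words(ticket, vocab))  # -> [2, 1, 0, 0]
```

A vector like this, built from a customer's message, is exactly the kind of derived data a call-center assistance application could feed into its models.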
Where can you apply deep analytics to human input and automatically produce output that anticipates a need and is easily consumed? If you can think up answers to that question, you might benefit from cognitive computing.