A Quick Guide to Cognitive Computing

What is cognitive computing?

Seth Adler
06/19/2019

What is Cognitive Computing?

The definition of cognitive computing varies, and while an official definition has yet to be agreed upon, it is generally accepted that cognitive computing is a branch of artificial intelligence developed to think like and work alongside humans. The increase in computing power has brought a massive wave of new information that is impossible to process manually at any useful speed. Cognitive computing is able to digest this data with the purpose of assisting humans in decision-making. It is able to weigh complex, conflicting, and changing information contextually to offer a best-fit solution rather than necessarily the most algorithmically correct one.

For example, a cognitive model may suggest a practical, achievable weight loss plan to manage diabetes over the most effective yet unrealistic one. It may also take age and financial goals into account when offering investment advice, even if the projected rate of return is lower.

 


Listen to Ingo Zimny, system engineer for DVB Bank, discuss sourcing cognitive solutions

Source: The AIIA Network Podcast

 

The Cognitive Computing Consortium is a forum where the cognitive computing community comes together to advance the field. It has further defined cognitive computing as having four specific characteristics:

  • Adaptive – As new data is introduced, cognitive systems must be able to adapt on the fly, making real-time adjustments as needed.
  • Interactive – Cognitive systems must be able to achieve human-computer interaction (HCI). This means cognitive systems can exchange information with humans and with other technologies, such as cloud services or hardware devices.
  • Iterative and Stateful – Cognitive systems must be able to recognize when a problem is not yet solvable as stated and seek out the information necessary to solve it, remembering and comparing similar past situations as they go (a small sketch of this behavior follows this list).
  • Contextual – Since cognitive computing mimics human behavior, it must also be able to understand context. It can use sensor data as well as structured and unstructured data to achieve this.
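
To make the iterative, stateful, and contextual characteristics concrete, here is a minimal, purely hypothetical sketch in Python (no real cognitive platform is assumed, and all names are invented) of an assistant that remembers what it has already been told, asks for whatever context is still missing, and only recommends once it has enough information:

```python
# Hypothetical illustration of an iterative, stateful assistant.
# It remembers what it has learned (stateful), asks for the missing
# pieces instead of giving up (iterative), and only recommends once
# it has enough context (contextual).

REQUIRED_CONTEXT = ["age", "activity_level", "dietary_restrictions"]

class DiabetesPlanAssistant:
    def __init__(self):
        self.context = {}  # persists across turns of the conversation

    def tell(self, key, value):
        """Store a fact supplied by the user."""
        self.context[key] = value

    def next_question(self):
        """Return the next clarifying question, or None if ready."""
        for key in REQUIRED_CONTEXT:
            if key not in self.context:
                return f"Could you tell me your {key.replace('_', ' ')}?"
        return None

    def recommend(self):
        if self.next_question() is not None:
            return "I still need more information."
        # A real system would weigh conflicting goals here; this simply
        # branches on the stored context for illustration.
        if self.context["activity_level"] == "low":
            return "Start with a gentle walking plan and a modest calorie cut."
        return "Combine your current exercise routine with a structured meal plan."

assistant = DiabetesPlanAssistant()
print(assistant.next_question())   # asks for age first
assistant.tell("age", 54)
assistant.tell("activity_level", "low")
assistant.tell("dietary_restrictions", "none")
print(assistant.next_question())   # None -> enough context gathered
print(assistant.recommend())
```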

How Does Cognitive Computing Work?

Cognitive computing combines several different technologies to develop its cognitive models.

  • Natural Language Processing - Because cognitive systems aim to work side by side with and learn from humans, they must be able to understand human speech and written text. Natural language processing (NLP) is the area of computer science concerned with this. Cognitive systems can receive, understand, interpret, and offer feedback in written and spoken forms that emulate human syntax. Although this feat is incredibly difficult given the considerable variation in human communication, NLP technology is advancing at a rapid rate.

NLP is responsible for predictive text, Google’s ability to guess search parameters, communications with Siri and Alexa, and natural-sounding chatbots.
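
As an illustration of the kind of NLP that sits behind a simple chatbot, the sketch below classifies a customer's free-text request into an intent. It assumes scikit-learn and uses a tiny, invented training set; production assistants like Siri or Alexa use far more sophisticated models, so treat this only as a toy example of turning language into a prediction:

```python
# Minimal sketch of chatbot-style intent detection with scikit-learn.
# The training phrases and intent labels are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

training_phrases = [
    "what is my account balance",
    "show me my balance",
    "transfer money to savings",
    "send funds to my savings account",
    "talk to a human agent",
    "connect me with customer support",
]
intents = ["balance", "balance", "transfer", "transfer", "escalate", "escalate"]

# Bag-of-words features plus a naive Bayes classifier: crude compared with
# the NLP behind modern assistants, but it shows how free text becomes a
# prediction the rest of the system can act on.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(training_phrases, intents)

print(model.predict(["can you move cash into savings"]))  # -> ['transfer']
```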

While NLP is the most innovative tool cognitive computing utilizes, other artificial intelligence processes work alongside it as well.

  • Machine Learning - Machine learning (ML) algorithms are designed to recognize patterns, calculate the probability of a certain outcome occurring, and “learn” from errors and successes through a feedback loop. In cognitive systems, ML frequently relies on neural networks, computing systems modeled on how the human brain processes information.
  • Deep Learning - Deep learning is an approach to machine learning, itself a branch of artificial intelligence (AI), that is most commonly used to label vast and complex data using artificial neural networks (ANNs) with many layers.
  • Artificial Neural Networks - An artificial neural network (ANN) is a framework modeled after the human brain, consisting of input, hidden, and output layers. Its artificial neurons weigh input data, categorize aspects of that data, connect to other neurons, and feed the results further down the classification funnel. ANNs are the machine learning approach most commonly used to label data, and they are found in computer vision, speech recognition, medical diagnosis, and other ML classification applications (see the sketch after this list).
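
The sketch below is a minimal numpy illustration of the structure those bullets describe: inputs flowing through a hidden layer of artificial neurons to an output layer. The weights are random, so it shows the shape of an ANN rather than a trained model; training would adjust those weights through a feedback loop (backpropagation) so that errors and successes gradually improve the predictions:

```python
# Minimal sketch of a feedforward artificial neural network: inputs flow
# through a hidden layer of weighted artificial neurons to an output layer.
# Weights are random, so this shows the structure, not a trained model.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 4 input features -> 3 hidden neurons -> 2 output classes
w_hidden = rng.normal(size=(4, 3))
w_output = rng.normal(size=(3, 2))

x = np.array([0.2, 0.7, 0.1, 0.9])            # one input example

hidden = sigmoid(x @ w_hidden)                # each hidden neuron weighs the inputs
scores = hidden @ w_output                    # results feed the "classification funnel"
probabilities = np.exp(scores) / np.exp(scores).sum()   # softmax over the two classes

print(probabilities)   # e.g. something like [0.4, 0.6] -- class likelihoods
```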

 

The Distinction Between Cognitive Computing and AI

Artificial intelligence and cognitive computing work from the same building blocks, but cognitive computing adds a human element to the results. AI is a number cruncher: it receives large amounts of data, learns from it, finds patterns in it, and delivers solutions based on it. Cognitive computing, on the other hand, takes those outputs and examines their nuances.

For example, Google Maps and Fitbit use big data, machine learning, and predictive analytics in their problem-solving methods. Siri, Alexa, Watson, and company-specific chatbots do as well, but they can receive and deliver information naturally, leveraging both structured and unstructured data. These products are AI technologies given human-like names and personalities.

 


Tobias Sebastian Unger, head of innovation, digitalization & benchmarking at Siemens Shared Services, discusses Machine Learning and the multi-channel cognitive agent

Source: AIIA.net: AI and Intelligent Automation

 

Traditional artificial intelligence uses structured data to populate its algorithms. Structured data is information organized in spreadsheets or other formatted records and stored in fixed fields. The cognitive computing branch of AI is able to operate on unstructured data, which is any data not contained in a formatted record. Sensor data, video and audio transmissions, social media posts, and communications data are all forms of unstructured data. It is estimated that over 80 percent of today’s data is unstructured, and with new data arriving from the Internet of Things (IoT), this share is expected to grow.
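
As a concrete, purely hypothetical contrast, the snippet below places a structured order record next to an unstructured customer message, then makes a naive attempt to recover structure from the free text. A real cognitive system would use NLP rather than a single regular expression, but the goal is the same: derive fixed fields from data that has none.

```python
# Hypothetical contrast between structured and unstructured data.
import re

# Structured: fixed fields, ready for a spreadsheet or database table.
structured_order = {"order_id": 10342, "item": "printer ink", "quantity": 3, "status": "shipped"}

# Unstructured: free text from a customer email; no fixed fields at all.
unstructured_message = (
    "Hi, I ordered printer ink last week (I think the order number was 10342) "
    "and it still hasn't arrived. Can you check on it?"
)

# A naive extraction step: a real system would use NLP, not one regular
# expression, and the sentiment label below is hard-coded purely to show
# the kind of structure a cognitive system tries to recover.
match = re.search(r"order number was (\d+)", unstructured_message)
extracted = {"order_id": int(match.group(1)) if match else None, "sentiment": "frustrated"}

print(structured_order["status"])   # direct lookup works on structured data
print(extracted)                    # structure recovered from unstructured text
```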

Additionally, cognitive computing aims to assist humans with the task at hand rather than simply performing it or spitting out a single answer. It has often been said of cognitive computing that a human working with a machine will always beat a machine working alone.

Currently, a cognitive model must be carefully monitored by humans in order to meet a specific goal and avoid bias. Additionally, cognitive models need clearly defined use cases and can take months to train.

 

The Third Era of Computing

The first era of computing was the Tabulating Era, from 1900 to 1940, when machines used punch cards to input data and perform extremely simple calculations. The Programming Era took over in the 1950s, when complex data trees and feedback loops with a specific outcome became programmable. The limitation of that era was that programs were only as good as their preprogrammed algorithms and data sets. The Cognitive Era is said to have begun in 2011 and is marked by systems able to adapt, make judgments, and self-learn.

The future of cognitive computing will touch every sector and industry vertical, from education, banking, and fraud detection to autonomous vehicles, conservation efforts, and even makeup recommendations. Wherever there is data and a problem to be solved, cognitive computing can play a role.

 

IBM Watson

The game show Jeopardy! builds clever clues and wordplay into its question-and-answer format, which is why IBM Watson garnered so much attention in 2011 when it competed against, and beat, two of the show’s most famous champions. Watson is a question-answering computer system that is able to receive unstructured data, make sense of it, and deliver an answer in a similarly unstructured, human-like manner.

Today, Watson is used in the financial sector to advise clients on their banking needs at a level that surpasses a simple chatbot’s capabilities. It is able to distinguish when it should advise, ask follow-up or clarifying questions, or escalate the communication to a human. Watson technology is also behind Staples’ Easy Button, reducing errors and ordering time through its customer command system.

While it’s not as glamorous as winning a game of Jeopardy!, Watson is even saving lives in the healthcare sector. Doctors are utilizing Watson as a virtual assistant that can offer diagnostic possibilities, recommend lines of treatment, and even consider a patient’s emotional needs. It is important to remember that cognitive computing isn’t meant to replace human workers but to work alongside them, drawing from all of the available data to reach conclusions that humans can then act upon.

 

DeepMind

DeepMind, now owned by Google, has beaten experts at some of the most complex games, including Go, chess, and the video game StarCraft. DeepMind’s innovation lies in its self-learning ability. AlphaGo, a program powered by DeepMind, played millions of Go games against itself to learn the nuances and possibilities of the highly complex game, then studied the playstyle of the expert it was set to face. Because it knew that expert’s typical moves, AlphaGo came out of the gate making moves that astounded and confounded its audience before winning the match.

DeepMind is now being used in eye care and other healthcare research and clinics.

 

Conclusion

Cognitive computing is advancing and shaping AI to act alongside humans as a digital assistant, allowing humans to make the best decisions based on history, data, experience, and desired outcomes.

 




