Today's PCs and smartphones can do a lot -- from telling you the weather in Zimbabwe in milliseconds, to buying your morning coffee. But ask them to show you what a piece of fabric feels like, or to detect the odor of a great-smelling soup, and they're lost.
That will change in the next five years, says IBM. Computers at that time will be much more aware of the world around them, and be able to understand it. The company's annual "5 in 5" list, in which IBM predicts the five trends in computing that will arrive in five years' time, reads exactly like a list of the five human senses -- predicting computers with sight, hearing, taste, smell and touch.
The five senses are really all part of one grand concept: cognitive computing, which involves machines experiencing the world more like a human would. For example, a cognizant computer wouldn't see a painting as merely a set of data points describing color, pigment and brush stroke; rather, it would truly see the object holistically as a painting, and be able to know what that means.
"That's a foundationally different way of thinking of computing," Bernie Meyerson, IBM's vice president of innovation, told Mashable in an interview. "You have to change how you think about absorbing data. You can't just take a picture and file the picture. You have to treat the picture as an entity at a very high level, as opposed to just a bunch o' bits."
"[Cognitive computing] makes for some very interesting shifts in capability," he adds. "That's a rather profound sort of driver."
One of the key differences between a cognizant computer and a traditional one is the idea of training. A cognitive system won't just continue to give the same wrong or unhelpful answer; if it arrives at the wrong conclusion, it can change its approach and try again.
"In a cognitive machine, you set it up and run it, but it observes," Meyerson says. "And that's very different because it statistically calculates an end result. However, if that answer is incorrect and you tell it, it'll actually re-weight those probabilities that led it to get the wrong answer and eventually get to the right answer."
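The re-weighting Meyerson describes can be pictured with a toy sketch (the class and candidate names here are hypothetical, purely for illustration): the system keeps a probability weight for each candidate answer, and when told a guess was wrong, it cuts that guess's weight so a different candidate wins the next try.

```python
class TrainableGuesser:
    """Toy illustration of 'training, not programming': the system keeps a
    weight per candidate answer and re-weights when told it was wrong."""

    def __init__(self, candidates):
        # Start with uniform belief over all candidate answers.
        self.weights = {c: 1.0 for c in candidates}

    def answer(self):
        # Pick the currently most probable candidate.
        return max(self.weights, key=self.weights.get)

    def feedback(self, guess, correct):
        # A wrong guess has its weight cut, so another candidate
        # can rise to the top on the next attempt.
        if not correct:
            self.weights[guess] *= 0.5

guesser = TrainableGuesser(["cat", "dog", "bird"])
first = guesser.answer()
guesser.feedback(first, correct=False)  # tell it the answer was wrong
second = guesser.answer()
assert second != first  # it changes its approach and tries again
```

This is, of course, a drastic simplification of what IBM is building, but it captures the shift Meyerson points to: the programmer never hard-codes the right answer; the system converges on it through feedback.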
Cognition Does Not Equal Intelligence
Attributing human senses to machines can't help but conjure images of androids or self-aware computers capable of independent thought and action. But Meyerson says there's a massive chasm separating cognitive computing from true artificial intelligence.
"This is really an assistive technology," he explains. "It can't go off on its own. It's not designed to do that. What it's designed to do, in fact, is respond to a human in an assistive manner. But by providing a human-style of input, it's freed us from the task of programming and moved to the task of training. It simply has -- not more intelligence -- but more bandwidth, and there's a huge difference between the two."
What's your take on cognitive computing? Is IBM on to something with PCs that can taste, smell, touch, hear and see? How would you use the technology? Share your thoughts in the comments.
IBM Predicts the Rise of Cognitive Computing
In its annual "5 in 5" prediction, IBM says that within five years, we will see the rise of cognitive computers -- machines that can experience the world much as humans do, through the five senses. Instead of interpreting an object as a set of data points, a cognitive system would look at it holistically, as an entity. For example, instead of seeing a painting as a canvas with various colors and brushstrokes, a cognitive computer could interpret it simply as da Vinci's Mona Lisa -- or a fake of the Mona Lisa. Cognitive systems hold advantages over traditional computers: they are more efficient, and they can learn from their mistakes.
This story originally published on Mashable.