Smart Machines
IBM'S WATSON AND THE ERA OF COGNITIVE COMPUTING

JOHN E. KELLY III AND STEVE HAMM
We are at the dawn of a major shift in the evolution of technology.
The next two decades will transform the way people live and
work just as the computing revolution has transformed the human
landscape over the past half century. The host of opportunities
and challenges that come with this new era will require a new
generation of technologies and a rewriting of the rules of
computing.
The availability of huge amounts of data should help people
better understand complex situations. In reality, though, more data
often lead to more confusion. We make too many decisions with
irrelevant or incorrect information or with data that represent
only part of the picture. So we need a new generation of
tools—cognitive technologies—that help us penetrate complexity
and better understand the world around us so we can make better
decisions and live more successfully and sustainably. Yet some of
the techniques of computer science and engineering are reaching
their limits. The technology industry must change the way it
designs and uses computers and software if it is to continue to
make progress in how we work and live.
This perspective on the future of information technology is
the result of a large and continuing group effort at IBM. A couple
of years ago, a group of IBM Research scientists engaged in an
intriguing project. They looked decades into the future and
sketched out a picture of how computing will change. Their work
sparked discussion and debate among a wide range of IBMers.
We want to expose some of these early thoughts to others and
start a broader conversation, so we’re laying out our vision in a
short book, Smart Machines: IBM’s Watson and the Era of Cognitive
Computing, which will be published later in 2013. This first chapter
is a teaser. We want people to know what’s coming.
Laying the foundations for a new era of computing is a
monumental endeavor, and no company can take on this sort of
challenge alone. We look to leading corporate users of information
technology, university researchers, government policy makers,
industry partners, and tech entrepreneurs—indeed, the entire tech
industry—to take this journey with us. We also want to inspire
young people to pursue studies and careers in science and
technology. With this book, we hope to provoke new thinking that
will drive exploration and invention for the next fifty years.
1. A New Era of Computing
IBM’s Watson computer created a sensation when it bested two
past grand champions on the TV quiz show Jeopardy!. Tens of
millions of people suddenly understood how “smart” a computer
could be. This was no mere parlor trick; the scientists who
designed Watson built upon decades of research in the fields of
artificial intelligence and natural-language processing and
produced a series of breakthroughs. Their ingenuity made it
possible for a system to excel at a game that requires both
encyclopedic knowledge and lightning-quick recall. In preparation
for the match, the machine ingested more than one million pages
of information. On the TV show, first broadcast in February 2011,
the system was able to search that vast database in response
to questions, size up its confidence level, and, when sufficiently
confident, beat the humans to the buzzer. After more than five
years of intense research and development, a core team of about
twenty scientists had made a very public breakthrough. They
demonstrated that a computing system—using traditional
strengths and overcoming assumed limitations—could beat expert
humans in a complex question-and-answer competition using
natural language.
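
The confidence-gating idea described above can be sketched in a few lines of Python. This is only a toy illustration under invented assumptions (a tiny hand-made knowledge table, a simple word-overlap scorer, and a 0.3 threshold), not IBM's actual DeepQA pipeline: candidate answers are scored against evidence, and the system "buzzes in" only when the top score clears the threshold.

```python
# Toy sketch of confidence-gated answering, loosely inspired by the idea of
# scoring candidates and buzzing in only when sufficiently confident.
# The tiny "knowledge base", the word-overlap scorer, and the 0.3 threshold
# are invented for illustration; this is not IBM's DeepQA pipeline.

KNOWLEDGE = {
    "Toronto": "largest city in canada on lake ontario",
    "Chicago": "us city on lake michigan with o'hare airport",
}

def overlap_score(clue: str, evidence: str) -> float:
    """Fraction of clue words that also appear in the evidence text."""
    clue_words = set(clue.lower().split())
    evidence_words = set(evidence.lower().split())
    return len(clue_words & evidence_words) / max(len(clue_words), 1)

def answer_if_confident(clue: str, threshold: float = 0.3):
    scores = {cand: overlap_score(clue, text) for cand, text in KNOWLEDGE.items()}
    best, confidence = max(scores.items(), key=lambda kv: kv[1])
    if confidence >= threshold:
        return best, confidence   # confident enough to "buzz in"
    return None, confidence       # otherwise stay silent rather than guess

print(answer_if_confident("This US city sits on Lake Michigan"))
```

The real system combined hundreds of evidence scorers and learned how to weigh them; the point of the sketch is only the gate on confidence before answering.
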
Now IBM scientists and software engineers are busy
improving the Watson technology so it can take on much bigger
and more useful tasks. The Jeopardy! challenge was relatively
limited in scope. It was bound by the rules of the game and the
fact that all the information Watson required could be expressed
in words on a page. In the future, Watson will take on open-ended
problems. It will be able to interpret images, numbers, voices, and
sensory information. It will participate in dialogue with human
beings aimed at navigating vast quantities of information to solve
extremely complicated yet common problems. The goal is to
transform the way humans get things done, from health care and
education to financial services and government.
One of the next challenges for Watson is to help doctors
diagnose diseases and assess the best treatments for individual
patients. IBM is working with physicians at the Cleveland Clinic
and Memorial Sloan-Kettering Cancer Center in New York to train
Watson for this new role. The idea is not to prove that Watson
could do the work of a doctor but to make Watson a useful aid to
a physician. The Jeopardy! challenge pitted man against machine;
with Watson and medicine, man and machine are taking on a
challenge together—and going beyond what either could do on
its own. It’s impossible for even the most accomplished doctors
to keep up with the explosion of new knowledge in their fields.
Watson can keep up to date, though, and provide doctors with
the information they need. Diseases can be freakishly complicated,
and they express themselves differently in each individual. Within
the human genome, there are billions of combinations of variables
that can figure in the course of a disease. So it’s no wonder that an
estimated 15 to 20 percent of medical diagnoses are inaccurate or
incomplete.1 Doctors know how to deal with general categories of
diseases and patients. What they need help with is diagnosing and
treating individuals.
Dr. Larry Norton, a world-renowned oncologist at Memorial
Sloan-Kettering Cancer Center who is helping to train Watson,
believes the computer will provide both encyclopedic medical and
patient information and the kind of insights that normally come
only from deeply experienced specialists. But in addition to
knowledge, he believes, Watson will offer wisdom. “This is more
than a machine,” Larry says. “Computer science is going to evolve
rapidly and medicine will evolve with it. This is coevolution. We’ll
help each other.”2
The Coming Era of Cognitive Computing
Watson’s potential to help with health care is just one of
the possibilities opening up for next-generation technologies.
Scientists at IBM and elsewhere are pushing the boundaries of
science and technology fields ranging from nanotechnology to
artificial intelligence with the goal of creating machines that do
much more than calculate and organize and find patterns in
data—they sense, learn, reason, and interact with people in new
ways. Watson’s exploits on TV were one of the first steps into a
new phase in the evolution of information technology—the era of
cognitive computing. Thomas Malone, director of the MIT Center
for Collective Intelligence, says the big question for researchers as
this era unfolds is: How can people and computers be connected
so that collectively they act more intelligently than any person,
group, or computer has ever done before?3 This avenue of thought
stretches back to the computing pioneer J. C. R. Licklider, who
led the U.S. government project that evolved into the Internet.
In 1960 he authored a paper, "Man-Computer Symbiosis," in which
he predicted that "in not too many years, human brains and
computing machines will be coupled together very tightly and
the resulting partnership will think as no human brain has ever
thought and process data in a way not approached by the
information-handling machines we know today."4 That time is fast
approaching.

1. Dr. Herb Chase, Columbia University School of Medicine, IBM IBV report, "The Future of Connected Healthcare Devices," March 2011.
2. Dr. Larry Norton, Memorial Sloan-Kettering Cancer Center, interview, June 12, 2012.

The new era of computing is not just an opportunity for
society; it’s also a necessity. Only with the help of thinking
machines will we be able to deal adequately with the exploding
complexity of today’s world and successfully address interlocking
problems like disease and poverty and stress on our natural
systems. Computers today are brilliant idiots. They have
tremendous capacities for storing information and performing
numerical calculations—far superior to those of any human. Yet
when it comes to another class of skills, the capacities for
understanding, learning, adapting, and interacting, computers are
woefully inferior to humans; there are many situations where
computers can’t do a lot to help us.
Up until now, that hasn't mattered much. Over the past sixty-plus
years, computers have transformed the world by automating
defined tasks and processes that can be codified in software
programs in a series of procedural "if A, then B"
statements—expressing logic or mathematical equations. Faced
with more complex tasks or changes in tasks, software
programmers add to or modify the steps in the operations they
want the machine to perform. This model of computing—in which
every step and scenario is determined in advance by a
person—can't keep up with the world's evolving social and
business dynamics or deliver on its potential. The emergence of
social networking, sensor networks, and huge storehouses of
business, scientific, and government records creates an abundance
of information that technology leaders call "big data." Think of it as
a parallel universe to the world of people, places, things, and their
interrelationships, but the digital universe is growing at about 60
percent each year.5

3. Thomas Malone, Massachusetts Institute of Technology, interview, May 3, 2013.
4. J. C. R. Licklider, "Man-Computer Symbiosis," IRE Transactions on Human Factors in Electronics 1 (March 1960): 4–11.

All of these data create the potential for people to understand
the environment around us with a depth and clarity that was
simply not possible before. Governments and businesses struggle
to understand complex situations, such as the inner workings of
a city or the behavior of global financial markets. In the cognitive
era, using the new tools of decision science, we will be able to
apply new kinds of computing power to huge volumes of data and
achieve deeper insight into how things really work. Armed with
those insights, we can develop strategies and design systems for
achieving the best outcomes—taking into account the effects of
the variable and the unknowable. Think of big data as a natural
resource waiting to be mined. And in order to tap this vast
resource, we need computers that “think” and interact more like
we do.
The human brain evolved over millions of years to become
a remarkable instrument of cognition. We are capable of sorting
through multitudes of sensory impressions in the blink of an eye.
For instance, faced with the chaotic scene of a busy intersection,
we’re able to instantly identify people, vehicles, buildings, streets,
and sidewalks and understand how they relate to one another. We
can recognize and greet a friend we haven't seen for ten years
even while sensing and prioritizing the need to avoid stepping in
front of a moving bus. Today's computers can't do that.

5. CenturyLink Business Inc., infographic, 2011, http://www.centurylink.com/business/artifacts/pdf/resources/big-data-defining-the-digital-deluge.pdf.

With the exception of robots, tomorrow’s computers won’t
need to navigate in the world the way humans do. But to help
us think better they will need the underlying humanlike
characteristics—learning, adapting, interacting, and some form of
understanding—that make human navigation possible. New
cognitive systems will extract insights from data sources that are
almost totally opaque today, such as population-wide health-care
records, or from new sources of information, such as sensors
monitoring pollution in delicate marine environments. Such
systems will still sometimes be programmed by people using “if
A, then B” logic, but programmers won’t have to anticipate every
procedure and every rule. Instead, computers will be equipped
with interpretive capabilities that will let them learn from the
data and adapt over time as they gain new knowledge or as the
demands on them change.
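
The difference between "if A, then B" programming and a system that learns from data can be made concrete with a minimal sketch. Everything here is hypothetical for illustration (an invented temperature-alert task, made-up readings, and a deliberately crude learner): the first function encodes a rule fixed by a programmer, while the second derives its rule from labeled examples and can be re-fit as new data arrive.

```python
# Minimal sketch contrasting hand-coded "if A, then B" rules with a rule
# learned from data. The alert task and the numbers are made up for
# illustration; real cognitive systems learn far richer models than this.

# Hand-coded rule: every condition anticipated in advance by a programmer.
def rule_based_alert(temperature_c: float) -> bool:
    return temperature_c > 30.0          # "if A, then B", fixed forever

# Learned rule: the threshold is derived from labeled examples and can be
# re-fit whenever new data arrive, so the behavior adapts over time.
def fit_threshold(samples: list[tuple[float, bool]]) -> float:
    alerts = [t for t, alarmed in samples if alarmed]
    normals = [t for t, alarmed in samples if not alarmed]
    # Put the boundary midway between the two populations (a toy learner).
    return (min(alerts) + max(normals)) / 2

history = [(22.0, False), (26.0, False), (33.0, True), (35.0, True)]
learned_cutoff = fit_threshold(history)

def learned_alert(temperature_c: float) -> bool:
    return temperature_c > learned_cutoff

print(rule_based_alert(31.0), learned_alert(31.0), learned_cutoff)
```
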
But the goal is not to replicate human brains or replace
human thinking with machine thinking. Rather, in the era of
cognitive systems, humans and machines will collaborate to
produce better results, each bringing its own skills to the
partnership. The machines will be more rational and analytic—and,
of course, possess encyclopedic memories and tremendous
computational abilities. People will provide judgment, intuition,
empathy, a moral compass, and human creativity.
To understand what’s different about this new era, it helps to
compare it to the two previous eras in the evolution of information
technology. The tabulating era began in the nineteenth century
and continued into the 1940s. Mechanical tabulating machines
automated the process of recording numbers and making
calculations. They were essentially elaborate mechanical abacuses.
People used them to organize data and make calculations that
were helpful in everything from conducting a national population
census to tracking the performance of a company’s sales force. The
programmable computing era—today’s technologies—emerged in
the 1940s. Programmable machines are still based on a design laid
out by Hungarian American mathematician John von Neumann.
Electronic devices governed by software programs perform
calculations, execute logical sequences of steps, and store
information using millions of zeros and ones. Scientists built the
first such computers for use in decrypting encoded messages in
wartime. Successive generations of computing technology have
enabled everything from space exploration to global
manufacturing-supply chains to the Internet.
Tomorrow’s cognitive systems will be fundamentally
different from the machines that preceded them. While traditional
computers must be programmed by humans to perform specific
tasks, cognitive systems will learn from their interactions with
data and humans and be able to, in a sense, program themselves
to perform new tasks. Traditional computers are designed to
calculate rapidly; cognitive systems will be designed to draw
inferences from data and pursue the objectives they were given.
Traditional computers have only rudimentary sensing capabilities,
such as license-plate-reading systems on toll roads. Cognitive
systems will be able to sense more like humans do. They’ll
augment our hearing, sight, taste, smell, and touch. In the
programmable-computing era, people have to adapt to the way
computers work. In the cognitive era, computers will adapt to
people. They’ll interact with us in ways that are natural to us.
Von Neumann’s architecture has persisted for such a long
time because it provides a powerful means of performing many
computing tasks. His scheme called for the processing of data via
calculations and the application of logic in a central processing
unit. Today, the CPU is a microprocessor, a stamp-sized sliver of
silicon and metal that’s the brains of everything from smartphones
and laptops to the largest mainframe computers. Other major
components of the von Neumann design are the memory, where
data are stored in the computer while waiting to be processed, and
the technologies that bring data into the system or push it out.
These components are connected to the central processing unit
via a “bus”—essentially a highway for data. Most of the software
programs written for today’s computers are based on this
architecture.
But the design has a flaw that makes it inefficient: the von
Neumann bottleneck. Each element of the process requires
multiple steps where data and instructions are moved back and
forth between memory and the CPU. That requires a tremendous
amount of data movement and processing. It also means that
discrete processing tasks have to be completed linearly, one at a
time. For decades, computer scientists have been able to rapidly
increase the capabilities of central processing units by making
them smaller and faster. But we’re reaching the limits of our ability
to make those gains at a time when we need even more computing
power to deal with complexity and big data. And that’s putting
unbearable demands on today’s computing technologies—mainly
because today’s computers require so much energy to perform
their work.
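
A deliberately simplified model can make the bottleneck concrete. The sketch below is not a hardware simulator; it simply counts trips across a single shared bus in a toy fetch-and-execute loop, using an invented three-instruction program, to show that even trivial arithmetic is dominated by serial data movement between memory and the processor.

```python
# A didactic model of the von Neumann bottleneck (not a hardware simulator).
# Every instruction and every operand crosses one shared bus between memory
# and the CPU, one at a time, so bus traffic grows with every step even when
# the actual arithmetic is trivial.

memory = {"a": 3, "b": 4, "sum": 0}
program = [("LOAD", "a"), ("ADD", "b"), ("STORE", "sum")]

bus_transfers = 0
accumulator = 0

for opcode, operand in program:            # strictly serial: one step at a time
    bus_transfers += 1                     # fetch the instruction over the bus
    if opcode == "LOAD":
        accumulator = memory[operand]; bus_transfers += 1   # operand read
    elif opcode == "ADD":
        accumulator += memory[operand]; bus_transfers += 1  # operand read
    elif opcode == "STORE":
        memory[operand] = accumulator; bus_transfers += 1   # result write

print(memory["sum"], "computed with", bus_transfers, "bus transfers")
# 7 computed with 6 bus transfers: most of the "work" is data movement.
```
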
What’s needed is a new architecture for computing, one that
takes more inspiration from the human brain. Data processing
should be distributed throughout the computing system rather
than concentrated in a CPU. The processing and the memory
should be closely integrated to reduce the shuttling of data and
instructions back and forth. And discrete processing tasks should
be executed simultaneously rather than serially. A cognitive
computer employing these systems will respond to inquiries more
quickly than today’s computers; less data movement will be
required and less energy will be used. Today’s von Neumann–style
computing won’t go away when cognitive systems come online.
New chip and computing technologies will extend its life far into
the future. In many cases, the cognitive architecture and the von
Neumann architecture will be employed side by side in hybrid
systems. Traditional computing will become ever more capable
while cognitive technologies will do things that were not possible
before.
Today, a handful of technologies are getting a tremendous
amount of buzz, including the cloud, social networking, mobile,
and new ways to interact with computing from tablets to glasses.
These new technologies will fuel the requirement and desire for
cognitive systems that will, for example, both harvest insights
from social networks and enhance our experiences within them.
“This will affect everything. It will be like the discovery of DNA,”
predicts Ralph Gomory, a pioneer of applied mathematics who was
director of IBM Research in the 1970s and 1980s and later head of
the Alfred P. Sloan Foundation.6
How Cognitive Systems Will Help Us Think
As smart as human beings are, there are many things that we
can’t do or that we could do better. Cognitive systems will, in
many cases, help us overcome our limitations.
Complexity
We have difficulty processing large amounts of information
that come at us rapidly. We also have problems understanding
the interactions among elements of large systems, such as all of
the moving parts in a city or the global economy. With cognitive
computing, we will be able to harvest insights from huge
quantities of data to understand complex situations, make accurate
predictions about the future, and anticipate the unintended
consequences of actions.

6. Ralph Gomory, former IBM Research director, interview, March 19, 2012.

City mayors, for instance, will be able to understand the
interrelationships among the subsystems within their
cities—everything from electrical grids to weather to subways
to demographic trends to emergent cultural shifts expressed in
text, video, music, and visual arts. One example is monitoring
social media during a major storm to spot patterns of words and
images that indicate critical problems in particular neighborhoods.
Much of this information will come from sensors—video cameras,
instruments that detect motion or consumption, and devices that
spot anomalies. Mobile phones will also be used as sensors that
help us understand the movements of people. But mayors will
also be able to measure the financial, material, and knowledge
resources they put into a system and the results they get from
those investments. And they’ll be able to accurately predict the
effects of policies and actions they’re considering.
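
The storm-monitoring example can be sketched as a simple keyword scan over geotagged posts. The neighborhoods, posts, distress vocabulary, and two-report threshold below are all invented; a real system would analyze language and images far more deeply, but the shape of the computation (aggregate signals by place, surface the hotspots) is the same.

```python
# Toy sketch of the storm-monitoring example: count distress keywords in
# geotagged posts per neighborhood and surface the hotspots. The keywords,
# posts, and threshold are made up for illustration only.

from collections import Counter

DISTRESS = {"flooding", "outage", "trapped", "downed"}

posts = [
    ("Riverside", "basement flooding fast, need pumps"),
    ("Riverside", "power outage on elm street"),
    ("Hilltop",   "windy but we are fine"),
    ("Riverside", "tree downed across the road"),
]

signals = Counter()
for neighborhood, text in posts:
    if DISTRESS & set(text.lower().split()):   # any distress word present
        signals[neighborhood] += 1

hotspots = [n for n, count in signals.items() if count >= 2]
print(hotspots)   # ['Riverside'] -> prioritize crews there
```
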
Expertise
With the help of cognitive systems, we will be able to see the
big picture and make better decisions. This is especially important
when we’re trying to address problems that cut across intellectual
and industrial domains. For instance, police will be able to gather
crime statistics and combine them with information about
demographics, events, blueprints, economic activity, and weather
to produce better analysis and safer cities. Armed with abundant
data, police chiefs will be able to set strategies and deploy
resources more effectively—even predicting where and when
crimes are likely to happen. Patrol officers will gain a wealth
of information about locations they’re approaching. Situational
intelligence will be extremely useful when they’re about to knock
on the door of an apartment. The ability to achieve such
comprehensive understanding of situations at every level will be
an essential tool and will become one of the most important
factors in the economic growth and competitiveness of cities.
Objectivity
We all possess biases based on our personal experiences,
egos, and intuition about what works and what doesn’t, as well
as the influence of group dynamics. Cognitive systems can make
it possible for us to be more objective in our decision making.
Corporations may evolve into “conscious organizations” made up
of humans and systems in collaboration. Sophisticated analytic
engines will understand how an organization works, the dynamics
of its competitive environment, and the capabilities within the
organization and ecosystem of partners. Computers might take
notes at meetings, convert data to graphic images, spot hard-to-see
connections, and help guide individuals in achieving business
goals.
Imagination
Because of our prejudices, we have difficulty envisioning
things that are dramatically different than what we’re familiar
with. Cognitive systems will help us discover and explore new
and contrarian ideas. A chemical or pharmaceutical company’s
research-and-development team might use a cognitive system to
explore combinations of molecules or even individual atoms that
have not been contemplated before. Programs run on
high-performance computers will simulate the effects of the
combinations, spotting potentially valuable new materials or
drugs and also anticipating negative side effects. With the aid
of cognitive machines, researchers and engineers will be able to
explore millions of combinations in ways that are economical with
both time and money.
Senses
We can only take in and make sense of so much raw, physical
information. With cognitive systems, computer sensors teamed
with analytics software will vastly extend our ability to gather
and process such information. Imagine a world where individuals
carry their own personal Watson in the form of a handheld device.
These personal cognitive assistants will carry on conversations
with us that will make the speech technology in today’s
smartphones seem like baby talk. They will acquire knowledge
about us, in part, from observing what we see, say, touch, and
type on our electronic devices—so they can better anticipate our
wishes. In addition, the assistant will be able to use sophisticated
sensing to monitor a person’s health and threats to her well-being.
If there’s carbon monoxide or the influenza virus in a room, for
example, the device will alert its user. Over time, humans have
evolved to be more successful as a species. We continually adapt
to overcome our limitations. This partnership with computers is
simply the latest step in a long process of adaptation.
The uses for cognitive computing will be nearly limitless—a
veritable playground for the human imagination. Think of any
activity that involves a lot of complexity, many variables, a great
deal of uncertainty, and incomplete information and that requires
a high level of expertise and rapid decision making. That activity
is going to be a fat target for cognitive technologies. Just as the
personal computer, the Internet, mobile communications, and
social networking have given rise to tens of thousands of software
applications, Web services, and smartphone apps, the cognitive
era will produce a similar explosion of creativity. Think of the
coming technologies as cognitive apps. For enterprises, you can
easily envision apps for handling mergers and acquisitions, crisis
management, competitive analysis, and product design. Picture a
team within a company that’s in charge of sizing up acquisition
candidates using a cognitive M&A app. In order to augment its
understanding of potential targets, many of which are private
companies, the team will set up a special social network of
employees, customers, and business partners who have had direct
experiences with other companies in the same industry. The links
and the nature of the interactions will all be mapped out. A
cognitive system will find information stored there, gathering
insights about companies and suggesting acquisition targets. The
M&A team will also track the performance of previously acquired
companies, finding what worked and what didn’t. Those insights,
constantly updated in the learning system, will help the team
identify risks and synergies, helping it decide which acquisitions
to pursue. Moreover, in the everyday lives of individuals, cognitive
apps will help in selecting a college, making investment decisions,
choosing among insurance options, and purchasing a car or home.
Technology Breakthroughs: Necessities and Opportunities
Much of the progress in science and technology comes in
small increments. Scientists and engineers build on top of the
innovations that came before. Consider the tablet computer. The
first such devices appeared on the scene back in the 1980s. They
had large, touch-sensitive screens but weighed nearly five pounds
and were an inch and a half thick. They were more like bricks
than books, and about all you could do with them was scrawl
brief memos and fill out forms. After thirty years of gradual
improvements, we have slim, light, powerful tablets that combine
the features of a telephone, a personal computer, a television, and
more.
There’s nothing wrong with incremental innovation. It’s
absolutely necessary, and, sometimes, its results are both
delightful and transformational. A prime example is the iPhone.
With its superior navigation and abundance of easy-to-use
applications, this breakthrough product spawned a burst of
smartphone innovation, which combined with the social-networking
phenomenon to produce a revolutionary shift in
global human behavior. Yet, technologically, the iPhone was built on
top of many smartphone advances that preceded it. In fact, IBM
introduced the first data-accessing phone, called Simon, in 1994,
long before the term “smartphone” had been coined. New waves of
progress, however, require majorly disruptive innovations—things
like the transistor, the microchip, and the first programmable
computers. These are the advances that fundamentally change our
world.
Today, many of the core technologies that provide the basic
functions for traditional computers are mature; they have been in
use for decades. In some cases, each wave of improvements is less
profound than the wave that preceded it. We’re reaching the point
of diminishing returns. Yet, at the same time, the demands on
computing technology are growing exponentially. One example
of a maturing technology is microchips. These slivers of silicon
densely packed with tiny transistors replaced the vacuum tube.
Early on, they brought consumers digital watches and pocket-sized
radios. Today, a handful of chips provide all the data
processing required in a tablet or a data-center computer that
serves up tens of thousands of Facebook pages per day. Yet the
basic concept of a microchip is essentially the same as it was
forty years ago. They’re made of tiny transistors—switches that
turn on and off to create the zeros and ones required to perform
calculations and process information. The more transistors on a
chip, the more data can be processed and stored there. The
problem is that with each new generation of chips, it becomes
more difficult to shrink the wires and transistors and pack more
of them onto chips. So what’s needed is a disruptive innovation
or, more likely, several of them that will change the game in the
microchip realm and launch another period of rapid innovation.
Soon incremental innovation will no longer be sufficient.
People who demand the most from computers are already running
into the limits of today’s circuitry. Michel McCoy, director of
the Simulation and Computing Program at the U.S. Lawrence
Livermore National Laboratory, is among those calling for a
nationwide initiative involving national laboratories and
businesses aimed at coming up with radical new approaches to
microprocessor and computer-system design and software
programming. “In a sense, everything we’ve done up until this
point has been easy,” he says. “Now we have reached a physics-dominated
threshold in the design of microprocessors and
computing systems which, unless we do something about it, is
essentially going to stagnate progress.”7 We need more radical
innovations. In the years ahead, a number of fundamental
advances in science and technology will be required to make
progress. Think of those colorful Russian wooden dolls where
progressively smaller dolls nest inside slightly larger ones. We
need to achieve technology advances in layers.
The top layer is the way we interact with computers and get
them to do what we want. The big innovation at this outer layer
is “learning systems,” which we will explore deeply in chapter
2. The goal is to create machines that do not require as much
programming by humans. Instead they’ll be “taught” by people,
who will set objectives for them. As they learn, the machines will
work out the details of how to meet those goals.

7. Michel McCoy, Lawrence Livermore National Laboratory, interview, June 14, 2012.

The next layer represents how we organize and interpret data,
which we’ll discuss in chapter 3. Today’s databases do an excellent
job of organizing information in columns and rows; tomorrow’s
are being designed to manage huge volumes of different kinds of
data, understand information in context, and crunch data in real
time.
Another major dimension, which we’ll go into in chapter 4,
is how to make use of data gathered through sensor technology.
Today, we use rudimentary sensor technologies to perform useful
tasks such as locating leaks in water systems. In the cognitive era,
sensors and pattern-recognition software will augment our senses,
making us hyper-aware of the world around us.
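
Even the "rudimentary" water-leak example follows a recognizable pattern-recognition shape: compare live sensor readings against a learned baseline and flag sustained deviations. The sketch below uses invented flow numbers, a hypothetical two-times-baseline rule, and overnight readings (when household use should be near zero); real utilities use far more sophisticated models, but the basic pattern is the same.

```python
# Hedged sketch: flagging a possible water leak from flow-sensor readings.
# The rule (overnight flow staying well above a historical baseline) and the
# numbers are invented for illustration, not a utility's real algorithm.

from statistics import mean

def leak_suspected(overnight_flows: list[float], baseline: list[float],
                   factor: float = 2.0) -> bool:
    """Suspect a leak if every overnight reading exceeds `factor` times the
    historical overnight average, when usage should be near zero."""
    typical = mean(baseline)
    return all(flow > factor * typical for flow in overnight_flows)

history = [1.0, 1.2, 0.8, 1.1]           # liters/min, normal overnight flow
tonight = [3.4, 3.6, 3.5, 3.7]           # sustained elevated flow

print(leak_suspected(tonight, history))  # True -> worth dispatching a crew
```
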
The next layer represents the design of systems—how we fit
together all the physical components that make up a computer.
The challenge here, which we address in chapter 5, is creating
data-centric computers. The designers of computing systems have
long treated logic and memory as separate elements. Now, they
will meld the components together, first, on circuit boards and,
later, on single microchips. Also, they’ll move the processing to the
data, rather than vice versa.
Finally, in the innermost layer is nanotechnology, where we
manipulate matter at the molecular and atomic scale. In chapter
6, we’ll explore what it will take to invent a new physics of
computing. To overcome the limits of today’s microchip
technology, scientists must shift to new nanomaterials and new
approaches to switching from one digital state to another.
Possibilities include harnessing quantum mechanics or chips
driven by “synapses and neurons” for data processing.
A New Culture of Innovation
We’re still in the early stages of the emergence of this new era
of computing. Progress will require a willingness to make big bets,
take a long-term view, and engage in open collaboration. We’ll
explore the elements of the culture of innovation in each of the
subsequent chapters in what we call the journeys of discovery. An
absolutely critical aspect of the culture of innovation will be the
ambition and capabilities of the inventors themselves. For rapid
progress to be made in the new era of computing, young people
must be inspired to become scientists, and they must be educated
by teachers using superior tools and techniques. They have to
be rewarded and given opportunities to challenge everything we
think we know about how the world works. It requires dedication
and investment by all of society’s institutions, including families,
local communities, governments, universities, and businesses.
When we ask scientists at IBM Research what motivates
them, the answer is often that they want to change the world—not
in minuscule increments but in great leaps forward. Dr. Mark
Ritter, a senior manager in IBM Research’s Physical Sciences
Department, leads an effort to rethink the entire architecture of
computing for the era of cognitive systems inspired by the human
brain. As a child, Mark, whose father was a plumber, had an
intense curiosity about how things work on a fundamental level.
It was his good fortune that his grandparents, who lived near his
family in Grinnell, Iowa, had two neighbors who were physics
professors at Grinnell College. One of the physicists, whom Mark
pestered with science questions while the neighbor repaired his
VW in the driveway, lent Mark a book on particle physics when
he was about twelve years old. As a teenager, Mark bicycled over
to the campus to attend physics lectures. He built a simple gas
laser in the basement of his home. It was the beginning of decades
of inquiry into how things work and how they can work better.
A few years ago, after more than twenty years in IBM Research,
Mark and his colleagues recognized that the computing model
designed for mid-twentieth-century demands was running out of
gas. So they set about inventing a new one. “This is the most
exciting time in my career,” Mark says. “The old ways of doing
things aren’t going to solve efficiently the big, real-world problems
we face.”
For his part, Dr. Larry Norton of Memorial Sloan-Kettering
Cancer Center is driven to transform the way medicine is
practiced. Ever since he can remember, he was motivated by the
desire to do something with his life that would improve the world.
Born in 1947, he grew up at a time when people saw science as
a powerful means of solving humanity’s problems. He recalls a
thought-crystallizing experience when he was an undergraduate
at the University of Rochester. He lived in a dorm where students
often gathered in the mornings for freewheeling discussions of
politics, values, and ethics. He was already contemplating a career
in medicine, and the topic that day was, if you were a doctor and
had done everything medical science could offer to save a patient
but she died anyway, how would you feel? The students were
split. “I realized I would feel terrible about it,” he says. “Offering
everything available isn’t enough. I should have done better. And
since, because of limitations in the world’s knowledge, I couldn’t
do better, I should be involved in moving things forward.”
During his forty-year career, Larry has been an innovator
in cancer treatment. Among his contributions is the central role
he played in developing the Norton-Simon hypothesis, a once-revolutionary
but now widely used approach to chemotherapy.
In Larry’s years as a clinician, he has saved the lives of many
patients, but, of course, some of his patients have died. Those
deaths haunt him. He believes that he owes it to them and to
their children to improve the treatment of cancer and, ultimately,
to help eradicate the disease. He sees his work with Watson as
another way to contribute. Through a machine, he can share his
knowledge and expertise with other physicians. He can help save
cancer victims from a distance—people he has never met.
You will meet a host of innovators in this book. There’s
Dharmendra Modha, the IBM Research manager who is leading
a team of IBM and university scientists in a quest to mimic the
human brain. And there’s Murali Ramanathan, a professor of
pharmaceutical sciences and neurology at State University of New
York, Buffalo, who is studying the role of genetic and
environmental factors in multiple sclerosis. These innovators are
remarkable people. We depend on them to produce the surprising
advances that knock the world off kilter and, ultimately, have the
potential to make it a better place. We will need many of them to
make the transition to the era of cognitive systems. In the end, this
era is not about machines but about the people who design and
use them.

Smart Machines Copyright © 2013 Columbia University Press
This teaser was produced using PressBooks.com,
and PDF rendering was done by PrinceXML.

9 Steps For Building Winning Founding TeamAdam Moalla
 
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019IES VE
 
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...DianaGray10
 
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1DianaGray10
 
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDEADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDELiveplex
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Adtran
 
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsIgniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsSafe Software
 
AI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity WebinarAI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity WebinarPrecisely
 

Dernier (20)

activity_diagram_combine_v4_20190827.pdfactivity_diagram_combine_v4_20190827.pdf
activity_diagram_combine_v4_20190827.pdfactivity_diagram_combine_v4_20190827.pdfactivity_diagram_combine_v4_20190827.pdfactivity_diagram_combine_v4_20190827.pdf
activity_diagram_combine_v4_20190827.pdfactivity_diagram_combine_v4_20190827.pdf
 
Artificial Intelligence & SEO Trends for 2024
Artificial Intelligence & SEO Trends for 2024Artificial Intelligence & SEO Trends for 2024
Artificial Intelligence & SEO Trends for 2024
 
Linked Data in Production: Moving Beyond Ontologies
Linked Data in Production: Moving Beyond OntologiesLinked Data in Production: Moving Beyond Ontologies
Linked Data in Production: Moving Beyond Ontologies
 
Crea il tuo assistente AI con lo Stregatto (open source python framework)
Crea il tuo assistente AI con lo Stregatto (open source python framework)Crea il tuo assistente AI con lo Stregatto (open source python framework)
Crea il tuo assistente AI con lo Stregatto (open source python framework)
 
Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1Secure your environment with UiPath and CyberArk technologies - Session 1
Secure your environment with UiPath and CyberArk technologies - Session 1
 
NIST Cybersecurity Framework (CSF) 2.0 Workshop
NIST Cybersecurity Framework (CSF) 2.0 WorkshopNIST Cybersecurity Framework (CSF) 2.0 Workshop
NIST Cybersecurity Framework (CSF) 2.0 Workshop
 
UiPath Studio Web workshop series - Day 7
UiPath Studio Web workshop series - Day 7UiPath Studio Web workshop series - Day 7
UiPath Studio Web workshop series - Day 7
 
How Accurate are Carbon Emissions Projections?
How Accurate are Carbon Emissions Projections?How Accurate are Carbon Emissions Projections?
How Accurate are Carbon Emissions Projections?
 
Comparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and IstioComparing Sidecar-less Service Mesh from Cilium and Istio
Comparing Sidecar-less Service Mesh from Cilium and Istio
 
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
Apres-Cyber - The Data Dilemma: Bridging Offensive Operations and Machine Lea...
 
COMPUTER 10 Lesson 8 - Building a Website
COMPUTER 10 Lesson 8 - Building a WebsiteCOMPUTER 10 Lesson 8 - Building a Website
COMPUTER 10 Lesson 8 - Building a Website
 
UWB Technology for Enhanced Indoor and Outdoor Positioning in Physiological M...
UWB Technology for Enhanced Indoor and Outdoor Positioning in Physiological M...UWB Technology for Enhanced Indoor and Outdoor Positioning in Physiological M...
UWB Technology for Enhanced Indoor and Outdoor Positioning in Physiological M...
 
9 Steps For Building Winning Founding Team
9 Steps For Building Winning Founding Team9 Steps For Building Winning Founding Team
9 Steps For Building Winning Founding Team
 
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019
IESVE Software for Florida Code Compliance Using ASHRAE 90.1-2019
 
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...
Connector Corner: Extending LLM automation use cases with UiPath GenAI connec...
 
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1UiPath Platform: The Backend Engine Powering Your Automation - Session 1
UiPath Platform: The Backend Engine Powering Your Automation - Session 1
 
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDEADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
ADOPTING WEB 3 FOR YOUR BUSINESS: A STEP-BY-STEP GUIDE
 
Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™Meet the new FSP 3000 M-Flex800™
Meet the new FSP 3000 M-Flex800™
 
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration WorkflowsIgniting Next Level Productivity with AI-Infused Data Integration Workflows
Igniting Next Level Productivity with AI-Infused Data Integration Workflows
 
AI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity WebinarAI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity Webinar
 

Smart machines IBM’s watson and the era of cognitive computing


They demonstrated that a computing system—using traditional
strengths and overcoming assumed limitations—could beat expert humans in a complex question-and-answer competition using natural language.

Now IBM scientists and software engineers are busy improving the Watson technology so it can take on much bigger and more useful tasks. The Jeopardy! challenge was relatively limited in scope. It was bound by the rules of the game and the fact that all the information Watson required could be expressed in words on a page. In the future, Watson will take on open-ended problems. It will be able to interpret images, numbers, voices, and sensory information. It will participate in dialogue with human beings aimed at navigating vast quantities of information to solve extremely complicated yet common problems. The goal is to transform the way humans get things done, from health care and education to financial services and government.

One of the next challenges for Watson is to help doctors diagnose diseases and assess the best treatments for individual patients. IBM is working with physicians at the Cleveland Clinic and Memorial Sloan-Kettering Cancer Center in New York to train Watson for this new role. The idea is not to prove that Watson could do the work of a doctor but to make Watson a useful aid to a physician. The Jeopardy! challenge pitted man against machine; with Watson and medicine, man and machine are taking on a challenge together—and going beyond what either could do on its own.

It's impossible for even the most accomplished doctors to keep up with the explosion of new knowledge in their fields. Watson can keep up to date, though, and provide doctors with the information they need. Diseases can be freakishly complicated, and they express themselves differently in each individual. Within the human genome, there are billions of combinations of variables that can figure in the course of a disease. So it's no wonder that an estimated 15 to 20 percent of medical diagnoses are inaccurate or incomplete.[1]

[1] Dr. Herb Chase, Columbia University School of Medicine, IBM IBV report, "The Future of Connected Healthcare Devices," March 2011.

Doctors know how to deal with general categories of diseases and patients. What they need help with is diagnosing and treating individuals. Dr. Larry Norton, a world-renowned oncologist at Memorial Sloan-Kettering Cancer Center who is helping to train Watson, believes the computer will provide both encyclopedic medical and patient information and the kind of insights that normally come only from deeply experienced specialists. But in addition to knowledge, he believes, Watson will offer wisdom. "This is more than a machine," Larry says. "Computer science is going to evolve rapidly and medicine will evolve with it. This is coevolution. We'll help each other."[2]

[2] Dr. Larry Norton, Memorial Sloan-Kettering Cancer Center, interview, June 12, 2012.

The Coming Era of Cognitive Computing

Watson's potential to help with health care is just one of the possibilities opening up for next-generation technologies. Scientists at IBM and elsewhere are pushing the boundaries of science and technology fields ranging from nanotechnology to artificial intelligence with the goal of creating machines that do much more than calculate and organize and find patterns in data—they sense, learn, reason, and interact with people in new ways. Watson's exploits on TV were one of the first steps into a new phase in the evolution of information technology—the era of cognitive computing. Thomas Malone, director of the MIT Center for Collective Intelligence, says the big question for researchers as this era unfolds is: How can people and computers be connected so that collectively they act more intelligently than any person, group, or computer has ever done before?[3]

[3] Thomas Malone, Massachusetts Institute of Technology, interview, May 3, 2013.

This avenue of thought stretches back to the computing pioneer J. C. R. Licklider, who led the U.S. government project that evolved into the Internet. In 1960 he authored a paper, "Man-Computer Symbiosis," where he predicted that "in not too many years, human brains and computing machines will be coupled together very tightly and the resulting partnership will think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today."[4] That time is fast approaching.

[4] J. C. R. Licklider, "Man-Computer Symbiosis," IRE Transactions on Human Factors in Electronics 1 (March 1960): 4–11.

The new era of computing is not just an opportunity for society; it's also a necessity. Only with the help of thinking machines will we be able to deal adequately with the exploding complexity of today's world and successfully address interlocking problems like disease and poverty and stress on our natural systems.

Computers today are brilliant idiots. They have tremendous capacities for storing information and performing numerical calculations—far superior to those of any human. Yet when it comes to another class of skills, the capacities for understanding, learning, adapting, and interacting, computers are woefully inferior to humans; there are many situations where computers can't do a lot to help us.

Up until now, that hasn't mattered much. Over the past sixty-plus years, computers have transformed the world by automating defined tasks and processes that can be codified in software programs in a series of procedural "if A, then B" statements—expressing logic or mathematical equations. Faced with more complex tasks or changes in tasks, software programmers add to or modify the steps in the operations they want the machine to perform.

This model of computing—in which every step and scenario is determined in advance by a person—can't keep up with the world's evolving social and business dynamics or deliver on its potential.

The emergence of social networking, sensor networks, and huge storehouses of business, scientific, and government records creates an abundance of information that technology leaders call "big data." Think of it as a parallel universe to the world of people, places, things, and their interrelationships, but the digital universe is growing at about 60 percent each year.[5] All of these data create the potential for people to understand the environment around us with a depth and clarity that was simply not possible before. Governments and businesses struggle to understand complex situations, such as the inner workings of a city or the behavior of global financial markets. In the cognitive era, using the new tools of decision science, we will be able to apply new kinds of computing power to huge volumes of data and achieve deeper insight into how things really work. Armed with those insights, we can develop strategies and design systems for achieving the best outcomes—taking into account the effects of the variable and the unknowable. Think of big data as a natural resource waiting to be mined. And in order to tap this vast resource, we need computers that "think" and interact more like we do.

[5] CenturyLink Business Inc., infographic, 2011, http://www.centurylink.com/business/artifacts/pdf/resources/big-data-defining-the-digital-deluge.pdf.

The human brain evolved over millions of years to become a remarkable instrument of cognition. We are capable of sorting through multitudes of sensory impressions in the blink of an eye. For instance, faced with the chaotic scene of a busy intersection, we're able to instantly identify people, vehicles, buildings, streets, and sidewalks and understand how they relate to one another. We can recognize and greet a friend we haven't seen for ten years even while sensing and prioritizing the need to avoid stepping in front of a moving bus. Today's computers can't do that.

With the exception of robots, tomorrow's computers won't need to navigate in the world the way humans do. But to help us think better they will need the underlying humanlike characteristics—learning, adapting, interacting, and some form of understanding—that make human navigation possible. New cognitive systems will extract insights from data sources that are almost totally opaque today, such as population-wide health-care records, or from new sources of information, such as sensors monitoring pollution in delicate marine environments. Such systems will still sometimes be programmed by people using "if A, then B" logic, but programmers won't have to anticipate every procedure and every rule. Instead, computers will be equipped with interpretive capabilities that will let them learn from the data and adapt over time as they gain new knowledge or as the demands on them change.

But the goal is not to replicate human brains or replace human thinking with machine thinking. Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results, each bringing its own skills to the partnership. The machines will be more rational and analytic—and, of course, possess encyclopedic memories and tremendous computational abilities. People will provide judgment, intuition, empathy, a moral compass, and human creativity.
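
To make that contrast concrete, here is a minimal Python sketch (illustrative only: the sensor readings, labels, and threshold are invented, and this is not how Watson or any IBM system is built). It compares a hand-coded "if A, then B" rule with a routine that derives its rule from examples a person has labeled:

    # Illustrative sketch only: invented data, not any real IBM system.
    def rule_based_alert(reading):
        # Classic "if A, then B" programming: a person fixes the rule in advance.
        if reading > 50.0:
            return "alert"
        return "normal"

    def learn_threshold(examples):
        # "Learning" in the most minimal sense: pick the cutoff that best
        # separates readings labeled "alert" from those labeled "normal".
        best_cutoff, best_correct = 0.0, -1
        for cutoff, _ in examples:
            correct = sum((r > cutoff) == (label == "alert") for r, label in examples)
            if correct > best_correct:
                best_cutoff, best_correct = cutoff, correct
        return best_cutoff

    # Invented training examples: (sensor reading, label supplied by a person).
    examples = [(12.0, "normal"), (34.0, "normal"), (61.0, "alert"), (78.0, "alert")]
    cutoff = learn_threshold(examples)

    def learned_alert(reading):
        # The rule now comes from data; new examples shift the behavior
        # without anyone rewriting the program.
        return "alert" if reading > cutoff else "normal"

The point is narrow but real: the second routine's behavior comes from the examples it is shown, so it can be revised by supplying new data rather than by rewriting the program, a property the cognitive systems described here scale up by many orders of magnitude.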

To understand what's different about this new era, it helps to compare it to the two previous eras in the evolution of information technology. The tabulating era began in the nineteenth century and continued into the 1940s. Mechanical tabulating machines automated the process of recording numbers and making calculations. They were essentially elaborate mechanical abacuses. People used them to organize data and make calculations that were helpful in everything from conducting a national population census to tracking the performance of a company's sales force.

The programmable computing era—today's technologies—emerged in the 1940s. Programmable machines are still based on a design laid out by Hungarian American mathematician John von Neumann. Electronic devices governed by software programs perform calculations, execute logical sequences of steps, and store information using millions of zeros and ones. Scientists built the first such computers for use in decrypting encoded messages in wartime. Successive generations of computing technology have enabled everything from space exploration to global manufacturing-supply chains to the Internet.

Tomorrow's cognitive systems will be fundamentally different from the machines that preceded them. While traditional computers must be programmed by humans to perform specific tasks, cognitive systems will learn from their interactions with data and humans and be able to, in a sense, program themselves to perform new tasks. Traditional computers are designed to calculate rapidly; cognitive systems will be designed to draw inferences from data and pursue the objectives they were given. Traditional computers have only rudimentary sensing capabilities, such as license-plate-reading systems on toll roads. Cognitive systems will be able to sense more like humans do. They'll augment our hearing, sight, taste, smell, and touch. In the programmable-computing era, people have to adapt to the way computers work. In the cognitive era, computers will adapt to people. They'll interact with us in ways that are natural to us.

Von Neumann's architecture has persisted for such a long time because it provides a powerful means of performing many computing tasks. His scheme called for the processing of data via calculations and the application of logic in a central processing unit.

Today, the CPU is a microprocessor, a stamp-sized sliver of silicon and metal that's the brains of everything from smartphones and laptops to the largest mainframe computers. Other major components of the von Neumann design are the memory, where data are stored in the computer while waiting to be processed, and the technologies that bring data into the system or push it out. These components are connected to the central processing unit via a "bus"—essentially a highway for data. Most of the software programs written for today's computers are based on this architecture.

But the design has a flaw that makes it inefficient: the von Neumann bottleneck. Each element of the process requires multiple steps where data and instructions are moved back and forth between memory and the CPU. That requires a tremendous amount of data movement and processing. It also means that discrete processing tasks have to be completed linearly, one at a time. For decades, computer scientists have been able to rapidly increase the capabilities of central processing units by making them smaller and faster. But we're reaching the limits of our ability to make those gains at a time when we need even more computing power to deal with complexity and big data. And that's putting unbearable demands on today's computing technologies—mainly because today's computers require so much energy to perform their work.

What's needed is a new architecture for computing, one that takes more inspiration from the human brain. Data processing should be distributed throughout the computing system rather than concentrated in a CPU. The processing and the memory should be closely integrated to reduce the shuttling of data and instructions back and forth. And discrete processing tasks should be executed simultaneously rather than serially. A cognitive computer employing these principles will respond to inquiries more quickly than today's computers; less data movement will be required and less energy will be used.
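
As a rough illustration of the bottleneck and the data-centric alternative described above (a toy accounting exercise with invented numbers, not a simulation of any real processor), compare two ways of summing the same values: one shuttles every value across a single bus to a central processor, while the other reduces each memory bank in place and moves only the partial results:

    # Toy accounting of data movement; the "bus transfer" counts are illustrative only.
    def central_sum(memory_banks):
        # von Neumann style: every value crosses the bus to one CPU, one at a time.
        bus_transfers, total = 0, 0
        for bank in memory_banks:
            for value in bank:
                bus_transfers += 1  # fetch the value into the CPU
                total += value
        return total, bus_transfers

    def data_centric_sum(memory_banks):
        # Data-centric style: each bank reduces its own values locally,
        # and only one partial result per bank travels to be combined.
        partials = [sum(bank) for bank in memory_banks]  # processing sits next to the data
        return sum(partials), len(partials)

    banks = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]]
    print(central_sum(banks))       # (78, 12): twelve values cross the bus
    print(data_centric_sum(banks))  # (78, 3): only three partial results move

Both versions produce the same answer; what differs is how much data has to move, which is the cost the von Neumann bottleneck imposes and the cost a data-centric, brain-inspired design tries to reduce.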

Today's von Neumann–style computing won't go away when cognitive systems come online. New chip and computing technologies will extend its life far into the future. In many cases, the cognitive architecture and the von Neumann architecture will be employed side by side in hybrid systems. Traditional computing will become ever more capable while cognitive technologies will do things that were not possible before.

Today, a handful of technologies are getting a tremendous amount of buzz, including the cloud, social networking, mobile, and new ways to interact with computing from tablets to glasses. These new technologies will fuel the requirement and desire for cognitive systems that will, for example, both harvest insights from social networks and enhance our experiences within them. "This will affect everything. It will be like the discovery of DNA," predicts Ralph Gomory, a pioneer of applied mathematics who was director of IBM Research in the 1970s and 1980s and later head of the Alfred P. Sloan Foundation.[6]

[6] Ralph Gomory, former IBM Research director, interview, March 19, 2012.

How Cognitive Systems Will Help Us Think

As smart as human beings are, there are many things that we can't do or that we could do better. Cognitive systems in many cases help us overcome our limitations.

Complexity

We have difficulty processing large amounts of information that come at us rapidly. We also have problems understanding the interactions among elements of large systems, such as all of the moving parts in a city or the global economy.

With cognitive computing, we will be able to harvest insights from huge quantities of data to understand complex situations, make accurate predictions about the future, and anticipate the unintended consequences of actions. City mayors, for instance, will be able to understand the interrelationships among the subsystems within their cities—everything from electrical grids to weather to subways to demographic trends to emergent cultural shifts expressed in text, video, music, and visual arts. One example is monitoring social media during a major storm to spot patterns of words and images that indicate critical problems in particular neighborhoods. Much of this information will come from sensors—video cameras, instruments that detect motion or consumption, and devices that spot anomalies. Mobile phones will also be used as sensors that help us understand the movements of people. But mayors will also be able to measure the financial, material, and knowledge resources they put into a system and the results they get from those investments. And they'll be able to accurately predict the effects of policies and actions they're considering.
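
As a minimal sketch of the storm-monitoring example above (the messages, neighborhoods, keywords, and threshold are all invented, and a real cognitive system would interpret language and images far more deeply), a first cut might simply count distress-related terms per neighborhood and flag the ones that stand out:

    # Illustrative sketch only: invented messages and keywords.
    from collections import Counter

    DISTRESS_TERMS = {"flooded", "flooding", "outage", "trapped", "collapsed"}

    messages = [
        ("Riverside", "basement flooded and the power outage is still going"),
        ("Riverside", "street totally flooded near the bridge"),
        ("Hillcrest", "windy here but everything is fine"),
    ]

    def flag_neighborhoods(messages, threshold=2):
        # Count distress-related words per neighborhood and flag any
        # neighborhood whose count reaches the (hypothetical) threshold.
        counts = Counter()
        for neighborhood, text in messages:
            counts[neighborhood] += sum(word in DISTRESS_TERMS for word in text.lower().split())
        return [name for name, count in counts.items() if count >= threshold]

    print(flag_neighborhoods(messages))  # ['Riverside']

A deployed system would go much further, fusing sensor feeds, images, and location data and learning what "critical" looks like, but the underlying move of turning a torrent of messages into a short list of neighborhoods to check is the same.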

Expertise

With the help of cognitive systems, we will be able to see the big picture and make better decisions. This is especially important when we're trying to address problems that cut across intellectual and industrial domains. For instance, police will be able to gather crime statistics and combine them with information about demographics, events, blueprints, economic activity, and weather to produce better analysis and safer cities. Armed with abundant data, police chiefs will be able to set strategies and deploy resources more effectively—even predicting where and when crimes are likely to happen. Patrol officers will gain a wealth of information about locations they're approaching. Situational intelligence will be extremely useful when they're about to knock on the door of an apartment. The ability to achieve such comprehensive understanding of situations at every level will be an essential tool and will become one of the most important factors in the economic growth and competitiveness of cities.

Objectivity

We all possess biases based on our personal experiences, egos, and intuition about what works and what doesn't, as well as the influence of group dynamics. Cognitive systems can make it possible for us to be more objective in our decision making. Corporations may evolve into "conscious organizations" made up of humans and systems in collaboration. Sophisticated analytic engines will understand how an organization works, the dynamics of its competitive environment, and the capabilities within the organization and ecosystem of partners. Computers might take notes at meetings, convert data to graphic images, spot hard-to-see connections, and help guide individuals in achieving business goals.

Imagination

Because of our prejudices, we have difficulty envisioning things that are dramatically different than what we're familiar with. Cognitive systems will help us discover and explore new and contrarian ideas. A chemical or pharmaceutical company's research-and-development team might use a cognitive system to explore combinations of molecules or even individual atoms that have not been contemplated before. Programs run on high-performance computers will simulate the effects of the combinations, spotting potentially valuable new materials or drugs and also anticipating negative side effects.

With the aid of cognitive machines, researchers and engineers will be able to explore millions of combinations in ways that are economical with both time and money.

Senses

We can only take in and make sense of so much raw, physical information. With cognitive systems, computer sensors teamed with analytics software will vastly extend our ability to gather and process such information. Imagine a world where individuals carry their own personal Watson in the form of a handheld device. These personal cognitive assistants will carry on conversations with us that will make the speech technology in today's smartphones seem like baby talk. They will acquire knowledge about us, in part, from observing what we see, say, touch, and type on our electronic devices—so they can better anticipate our wishes. In addition, the assistant will be able to use sophisticated sensing to monitor a person's health and threats to her well-being. If there's carbon monoxide or the influenza virus in a room, for example, the device will alert its user.

Over time, humans have evolved to be more successful as a species. We continually adapt to overcome our limitations. This partnership with computers is simply the latest step in a long process of adaptation.

The uses for cognitive computing will be nearly limitless—a veritable playground for the human imagination. Think of any activity that involves a lot of complexity, many variables, a great deal of uncertainty, and incomplete information and that requires a high level of expertise and rapid decision making. That activity is going to be a fat target for cognitive technologies. Just as the personal computer, the Internet, mobile communications, and social networking have given rise to tens of thousands of software applications, Web services, and smartphone apps, the cognitive era will produce a similar explosion of creativity.

Think of the coming technologies as cognitive apps. For enterprises, you can easily envision apps for handling mergers and acquisitions, crisis management, competitive analysis, and product design. Picture a team within a company that's in charge of sizing up acquisition candidates using a cognitive M&A app. In order to augment its understanding of potential targets, many of which are private companies, the team will set up a special social network of employees, customers, and business partners who have had direct experiences with other companies in the same industry. The links and the nature of the interactions will all be mapped out. A cognitive system will find information stored there, gathering insights about companies and suggesting acquisition targets. The M&A team will also track the performance of previously acquired companies, finding what worked and what didn't. Those insights, constantly updated in the learning system, will help the team identify risks and synergies, helping it decide which acquisitions to pursue. Moreover, in the everyday lives of individuals, cognitive apps will help in selecting a college, making investment decisions, choosing among insurance options, and purchasing a car or home.

Technology Breakthroughs: Opportunities and Necessities

Much of the progress in science and technology comes in small increments. Scientists and engineers build on top of the innovations that came before. Consider the tablet computer. The first such devices appeared on the scene back in the 1980s. They had large, touch-sensitive screens but weighed nearly five pounds and were an inch and a half thick.

They were more like bricks than books, and about all you could do with them was scrawl brief memos and fill out forms. After thirty years of gradual improvements, we have slim, light, powerful tablets that combine the features of a telephone, a personal computer, a television, and more.

There's nothing wrong with incremental innovation. It's absolutely necessary, and, sometimes, its results are both delightful and transformational. A prime example is the iPhone. With its superior navigation and abundance of easy-to-use applications, this breakthrough product spawned a burst of smartphone innovation, which combined with the social-networking phenomenon to produce a revolutionary shift in global human behavior. Yet, technologically, the iPhone was built on top of many smartphone advances that preceded it. In fact, IBM introduced the first data-accessing phone, called Simon, in 1994, long before the term "smartphone" had been coined.

New waves of progress, however, require major disruptive innovations—things like the transistor, the microchip, and the first programmable computers. These are the advances that fundamentally change our world. Today, many of the core technologies that provide the basic functions for traditional computers are mature; they have been in use for decades. In some cases, each wave of improvements is less profound than the wave that preceded it. We're reaching the point of diminishing returns. Yet, at the same time, the demands on computing technology are growing exponentially.

One example of a maturing technology is microchips. These slivers of silicon densely packed with tiny transistors replaced the vacuum tube. Early on, they brought consumers digital watches and pocket-sized radios. Today, a handful of chips provide all the data processing required in a tablet or a data-center computer that serves up tens of thousands of Facebook pages per day.

Yet the basic concept of a microchip is essentially the same as it was forty years ago. They're made of tiny transistors—switches that turn on and off to create the zeros and ones required to perform calculations and process information. The more transistors on a chip, the more data can be processed and stored there. The problem is that with each new generation of chips, it becomes more difficult to shrink the wires and transistors and pack more of them onto chips. So what's needed is a disruptive innovation or, more likely, several of them that will change the game in the microchip realm and launch another period of rapid innovation.

Soon incremental innovation will no longer be sufficient. People who demand the most from computers are already running into the limits of today's circuitry. Michel McCoy, director of the Simulation and Computing Program at the U.S. Lawrence Livermore National Laboratory, is among those calling for a nationwide initiative involving national laboratories and businesses aimed at coming up with radical new approaches to microprocessor and computer-system design and software programming. "In a sense, everything we've done up until this point has been easy," he says. "Now we have reached a physics-dominated threshold in the design of microprocessors and computing systems which, unless we do something about it, is essentially going to stagnate progress."[7]

[7] Michel McCoy, Lawrence Livermore National Laboratory, interview, June 14, 2012.

We need more radical innovations. In the years ahead, a number of fundamental advances in science and technology will be required to make progress. Think of those colorful Russian wooden dolls where progressively smaller dolls nest inside slightly larger ones. We need to achieve technology advances in layers. The top layer is the way we interact with computers and get them to do what we want. The big innovation at this outer layer is "learning systems," which we will explore deeply in chapter 2.

The goal is to create machines that do not require as much programming by humans. Instead they'll be "taught" by people, who will set objectives for them. As they learn, the machines will work out the details of how to meet those goals (see the sketch below).

The next layer represents how we organize and interpret data, which we'll discuss in chapter 3. Today's databases do an excellent job of organizing information in columns and rows; tomorrow's are being designed to manage huge volumes of different kinds of data, understand information in context, and crunch data in real time.

Another major dimension, which we'll go into in chapter 4, is how to make use of data gathered through sensor technology. Today, we use rudimentary sensor technologies to perform useful tasks such as locating leaks in water systems. In the cognitive era, sensors and pattern-recognition software will augment our senses, making us hyper-aware of the world around us.

The next layer represents the design of systems—how we fit together all the physical components that make up a computer. The challenge here, which we address in chapter 5, is creating data-centric computers. The designers of computing systems have long treated logic and memory as separate elements. Now, they will meld the components together, first, on circuit boards and, later, on single microchips. Also, they'll move the processing to the data, rather than vice versa.

Finally, in the innermost layer is nanotechnology, where we manipulate matter at the molecular and atomic scale. In chapter 6, we'll explore what it will take to invent a new physics of computing. To overcome the limits of today's microchip technology, scientists must shift to new nanomaterials and new approaches to switching from one digital state to another. Possibilities include harnessing quantum mechanics or chips driven by "synapses and neurons" for data processing.
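
To illustrate the "learning systems" idea sketched above, in which a person states an objective and the machine works out the details of meeting it, here is a deliberately tiny example (the objective and the random search are invented stand-ins for far more sophisticated learning methods):

    # Illustrative sketch only: a person states the goal; a simple search finds the "how".
    import random

    def objective(x):
        # The "teacher" only says what counts as good: make x * x land close to 100.
        return abs(x * x - 100)

    def work_out_details(objective, steps=1000, seed=0):
        # The machine explores on its own, keeping any change that moves it
        # closer to the stated goal; nobody spells out the solution path.
        rng = random.Random(seed)
        best = rng.uniform(-20.0, 20.0)
        for _ in range(steps):
            candidate = best + rng.uniform(-1.0, 1.0)
            if objective(candidate) < objective(best):
                best = candidate
        return best

    print(round(work_out_details(objective), 2))  # settles near 10 or -10

Swapping in a different objective changes what the routine finds without changing the routine itself, which is the sense in which such systems are "taught" by setting goals rather than programmed step by step.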

A New Culture of Innovation

We're still in the early stages of the emergence of this new era of computing. Progress will require a willingness to make big bets, take a long-term view, and engage in open collaboration. We'll explore the elements of the culture of innovation in each of the subsequent chapters in what we call the journeys of discovery.

An absolutely critical aspect of the culture of innovation will be the ambition and capabilities of the inventors themselves. For rapid progress to be made in the new era of computing, young people must be inspired to become scientists, and they must be educated by teachers using superior tools and techniques. They have to be rewarded and given opportunities to challenge everything we think we know about how the world works. It requires dedication and investment by all of society's institutions, including families, local communities, governments, universities, and businesses.

When we ask scientists at IBM Research what motivates them, the answer is often that they want to change the world—not in minuscule increments but in great leaps forward. Dr. Mark Ritter, a senior manager in IBM Research's Physical Sciences Department, leads an effort to rethink the entire architecture of computing for the era of cognitive systems inspired by the human brain. As a child, Mark, whose father was a plumber, had an intense curiosity about how things work on a fundamental level. It was his good fortune that his grandparents, who lived near his family in Grinnell, Iowa, had two neighbors who were physics professors at Grinnell College. One of the physicists, whom Mark pestered with science questions while the neighbor repaired his VW in the driveway, lent Mark a book on particle physics when he was about twelve years old.

As a teenager, Mark bicycled over to the campus to attend physics lectures. He built a simple gas laser in the basement of his home. It was the beginning of decades of inquiry into how things work and how they can work better. A few years ago, after more than twenty years in IBM Research, Mark and his colleagues recognized that the computing model designed for mid-twentieth-century demands was running out of gas. So they set about inventing a new one. "This is the most exciting time in my career," Mark says. "The old ways of doing things aren't going to solve efficiently the big, real-world problems we face."

For his part, Dr. Larry Norton of Memorial Sloan-Kettering Cancer Center is driven to transform the way medicine is practiced. Ever since he can remember, he was motivated by the desire to do something with his life that would improve the world. Born in 1947, he grew up at a time when people saw science as a powerful means of solving humanity's problems. He recalls a thought-crystallizing experience when he was an undergraduate at the University of Rochester. He lived in a dorm where students often gathered in the mornings for freewheeling discussions of politics, values, and ethics. He was already contemplating a career in medicine, and the topic that day was, if you were a doctor and had done everything medical science could offer to save a patient but she died anyway, how would you feel? The students were split. "I realized I would feel terrible about it," he says. "Offering everything available isn't enough. I should have done better. And since, because of limitations in the world's knowledge, I couldn't do better, I should be involved in moving things forward."

During his forty-year career, Larry has been an innovator in cancer treatment. Among his contributions is the central role he played in developing the Norton-Simon hypothesis, a once-revolutionary but now widely used approach to chemotherapy.
  • 22. A New Era of Computing 21 In Larry’s years as a clinician, he has saved the lives of many patients, but, of course, some of his patients have died. Those deaths haunt him. He believes that he owes it to them and to their children to improve the treatment of cancer and, ultimately, to help eradicate the disease. He sees his work with Watson as another way to contribute. Through a machine, he can share his knowledge and expertise with other physicians. He can help save cancer victims from a distance—people he has never met. You will meet a host of innovators in this book. There’s Dharmendra Modha, the IBM Research manager who is leading a team of IBM and university scientists in a quest to mimic the human brain. And there’s Murali Ramanathan, a professor of pharmaceutical sciences and neurology at State University of New York, Buffalo, who is studying the role of genetic and environmental factors in multiple sclerosis. These innovators are remarkable people. We depend on them to produce the surprising advances that knock the world off kilter and, ultimately, have the potential to make it a better place. We will need many of them to make the transition to the era of cognitive systems. In the end, this era is not about machines but about the people who design and use them. Smart Machines Copyright © 2013 Columbia University Press This teaser was produced using PressBooks.com, and PDF rendering was done by PrinceXML.