Saturday, November 5, 2011

Inventing the Future of Computing


In a windowless room deep inside IBM’s Almaden Research Center in San Jose, scientists are teaching a computer chip to learn from what it sees, much like a human. The effort is paying off, if performance at Pong is any measure.

When the chip, part of a project called SyNAPSE, first learned to play the classic videogame in March, it did poorly. Weeks later, the company reports, it was nearly unbeatable.

The SyNAPSE chip was designed to learn through experience, find correlations, create hypotheses, and remember outcomes. As chips such as the one from SyNAPSE become smarter and smaller, it will be possible to embed them in everyday objects. That portends a future in which the interaction between computer and user is far more natural and ubiquitous. “Computers were originally designed to solve math problems and that’s what they’re really good at—symbolic computation,” says Steve Esser, one of three scientists teaching the SyNAPSE chip. “Anything that involves visual processing, auditory processing, or speech processing—they can do it, but they’re just not very good at it.”
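The chip itself is neuromorphic hardware, but the learning loop it embodies (act, observe the outcome, adjust, remember) can be sketched in ordinary software. The toy below is an illustration only, not how SyNAPSE is actually programmed: a small tabular Q-learning agent chasing a ball in a one-dimensional Pong, with every name and number invented for the example.

```python
import random

ROWS = 6              # vertical positions available to ball and paddle
ACTIONS = (-1, 0, 1)  # paddle moves: up, stay, down
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration

# Q-table: value of each action in each (ball_row, paddle_row) state
Q = {(b, p): [0.0, 0.0, 0.0] for b in range(ROWS) for p in range(ROWS)}

def step(ball, paddle, a):
    """Apply an action, let the ball drift one row, return new state and reward."""
    paddle = min(ROWS - 1, max(0, paddle + ACTIONS[a]))
    ball = (ball + random.choice((-1, 1))) % ROWS
    return ball, paddle, (1.0 if ball == paddle else -1.0)

ball, paddle = 0, ROWS // 2
for _ in range(20000):  # learn from repeated experience
    s = (ball, paddle)
    # Mostly exploit what has been learned; occasionally explore at random.
    a = random.randrange(3) if random.random() < EPS else max(range(3), key=lambda i: Q[s][i])
    ball, paddle, r = step(ball, paddle, a)
    # Remember the outcome: nudge the stored value toward reward plus lookahead.
    Q[s][a] += ALPHA * (r + GAMMA * max(Q[(ball, paddle)]) - Q[s][a])

# Evaluate the learned policy greedily.
hits, ball, paddle = 0, 0, ROWS // 2
for _ in range(1000):
    a = max(range(3), key=lambda i: Q[(ball, paddle)][i])
    ball, paddle, r = step(ball, paddle, a)
    hits += r > 0
print(f"alignment rate after training: {hits / 1000:.0%}")
```

After a few thousand practice steps the table steers the paddle toward the ball far more often than chance, which is the same experience-driven improvement the IBM team reports, in miniature.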

IBM is the third-biggest spender on research and development among US tech companies, having invested $6.28 billion in the past 12 months, according to Bloomberg data. Along with Microsoft, the top spender at $9.18 billion, and Intel, at $7.71 billion, these technology bellwethers are investing in what they see as computing’s next wave.

MORE INNOVATION THAN EVER

“Computing is undergoing the most remarkable transformation since the invention of the PC,” said Intel Chief Executive Officer Paul Otellini during his company’s developer conference in September. “The innovation of the next decade is going to outstrip the innovation of the past three combined.”
At Microsoft, researchers are looking at the ways in which users will interact with computers in 3D spaces. In September the company announced a project reminiscent of Star Trek’s Holodeck, a simulated reality room where people could interact naturally with virtual objects and other individuals.
Microsoft’s effort, called Holodesk, is like a mini-Holodeck for the office desk that lets workers interact with and manipulate virtual 3D images.
It uses an Xbox Kinect camera and an optically transparent display to give people the illusion that they’re interacting with 3D graphics. For example, a user can juggle virtual balls or hold a virtual prototype of a smartphone. The Holodesk is part of Microsoft’s pioneering work in what it calls natural user interfaces.
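Microsoft hasn’t published Holodesk’s internals here, but the sense-simulate-render loop such a system needs can be sketched. Everything below is a hypothetical stand-in (the function names, the fake hand tracker, the 60 Hz physics), meant only to show the structure: read the depth camera, update the virtual objects, draw them on the see-through display.

```python
from dataclasses import dataclass

@dataclass
class Ball:
    y: float = 0.3   # height above the desk, metres
    vy: float = 0.0  # vertical velocity, m/s

GRAVITY, DT = -9.8, 1 / 60  # simple physics, one step per 60 Hz frame

def read_hand_height(frame_no: int) -> float:
    """Stand-in for hand tracking from a depth camera such as Kinect.
    Here we just fake a hand bobbing between 0.1 and 0.2 m."""
    return 0.1 + 0.1 * ((frame_no // 30) % 2)

def render(ball: Ball, frame_no: int) -> None:
    """Stand-in for drawing the scene on the see-through display,
    aligned with the user's viewpoint."""
    if frame_no % 30 == 0:
        print(f"frame {frame_no}: ball at {ball.y:.2f} m")

ball = Ball()
for frame in range(300):            # five simulated seconds
    hand = read_hand_height(frame)  # 1. sense: where is the user's hand?
    # 2. simulate: gravity, plus a damped bounce when the ball meets the hand
    ball.vy += GRAVITY * DT
    ball.y += ball.vy * DT
    if ball.y <= hand and ball.vy < 0:
        ball.y, ball.vy = hand, -ball.vy * 0.9
    render(ball, frame)             # 3. render over the real desk
```

The design point such a loop makes concrete is latency: all three steps have to complete within one display frame, or the virtual ball visibly lags the real hand.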

That work looks at how people will interact with computers when computing power is everywhere, not just in a PC. “There is this real sense that this is a dramatic new trend for the industry and for Microsoft,” said Steve Clayton, who writes about internal research for the company blog, “Next at Microsoft.” “We’ve been investing a lot in this vision.”

At Intel, the world’s biggest chipmaker, Brian David Johnson spends quite a bit of time thinking about the future—the year 2020, to be precise. In fact, the futurist recently participated in a conference call about building Intel’s 2020 CPU.

As chips become embedded in many different devices, the company has realized that it needs to change.


ALGORITHMS THAT GRASP HUMANITY

“Fast and less-expensive and smaller isn’t enough anymore; we really need to have an understanding of what we’re going to do with it,” says Johnson, who travels the world talking to people about how they envision the future.
“To be a human in 2020, it will begin to feel like data is taking on a life of its own,” he says. The proliferation of computing into everyday objects will generate massive quantities of sensor and other data, with algorithms talking to algorithms and machines talking to machines, he adds. “That algorithm—that thing that processes that massive amount of data—will need to have an understanding of what it means to be human.”
