Inspired by the architecture of the brain, scientists
have developed a new kind of computer chip that uses no more power than a
hearing aid and may eventually excel at calculations that stump today’s supercomputers.
The chip, or processor, is named TrueNorth and was
developed by researchers at IBM and detailed in an article published on Thursday in the
journal Science. It tries to mimic the way brains recognize patterns, relying
on densely interconnected webs of transistors similar to the brain’s neural
networks.
The chip’s electronic “neurons” are able to signal others
when a type of data — light, for example — passes a certain threshold. Working
in parallel, the neurons begin to organize the data into patterns suggesting
the light is growing brighter, or changing color or shape.
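As a rough illustration of that threshold behavior (a minimal sketch, not IBM's actual circuit design), the idea can be expressed in a few lines of Python; the Neuron class and the threshold and leak values below are illustrative assumptions, not details from the paper.

# Illustrative sketch: each "neuron" accumulates incoming signals and
# fires (signals its neighbors) only when the total crosses a threshold.
# A simple integrate-and-fire model, not the TrueNorth hardware itself.

class Neuron:
    def __init__(self, threshold=1.0, leak=0.1):
        self.threshold = threshold   # firing threshold (assumed value)
        self.leak = leak             # charge lost each tick (assumed value)
        self.potential = 0.0

    def step(self, inputs):
        """Accumulate one tick of input; return True if the neuron fires."""
        self.potential = max(self.potential + sum(inputs) - self.leak, 0.0)
        if self.potential >= self.threshold:
            self.potential = 0.0     # reset after firing
            return True
        return False

# Example: a light growing brighter delivers a stronger input each tick,
# and the neuron begins to fire once the signal is strong enough.
neuron = Neuron()
for tick, brightness in enumerate([0.1, 0.2, 0.4, 0.8, 1.2]):
    fired = neuron.step([brightness])
    print(f"tick {tick}: brightness={brightness:.1f} fired={fired}")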
The processor may thus be able to
recognize that a woman in a video is picking up a purse, or control a robot
that is reaching into a pocket and pulling out a quarter. Humans are able to
recognize these acts without conscious thought, yet today’s computers and
robots struggle to interpret them.
The chip contains 5.4 billion transistors, yet draws
just 70 milliwatts of power. By contrast, modern Intel processors in today’s
personal computers and data centers may have 1.4 billion transistors and
consume far more power — 35 to 140 watts.
"Today’s conventional microprocessors and graphics
processors are capable of performing billions of mathematical operations a
second, yet the new chip system clock makes its calculations barely a thousand
times a second. But because of the vast number of circuits working in parallel,
it is still capable of performing 46 billion operations a second per watt of
energy consumed, according to IBM researchers".
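Combined with the 70-milliwatt power draw cited above, that efficiency figure implies a total throughput of a few billion operations a second. A quick back-of-the-envelope check, using only the numbers quoted in this article:

# Back-of-the-envelope check using the figures quoted above.
power_watts = 0.070              # 70 milliwatts
ops_per_second_per_watt = 46e9   # 46 billion operations/s per watt

total_ops_per_second = ops_per_second_per_watt * power_watts
print(f"{total_ops_per_second:.2e} operations per second")  # roughly 3.2e9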
The TrueNorth has one million “neurons,” about as complex
as the brain of a bee.
“It is a remarkable achievement in terms of scalability
and low power consumption,” said Horst Simon, deputy director of the Lawrence Berkeley National
Laboratory.
He compared the new design to the advent of parallel
supercomputers in the 1980s, which he recalled was like moving from a two-lane
road to a superhighway.
The new approach to design, referred to variously as
neuromorphic or cognitive computing, is still in its infancy, and the IBM chips
are not yet commercially available. Yet the design has touched off a vigorous
debate over the best approach to speeding up the neural networks increasingly
used in computing.
The idea that neural networks might be useful in
processing information occurred to engineers in the 1940s, before the invention
of modern computers. Only recently, as computing has grown enormously in memory
capacity and processing speed, have they proved to be powerful computing tools.
In recent years, companies including
Google, Microsoft and Apple have turned to pattern recognition driven by neural
networks to vastly improve the quality of services like speech recognition and
photo classification.