Learn Like a Human

March 20, 2007

For 50 years, computer scientists have been trying to make computers intelligent while mostly ignoring the one thing that is intelligent: the human brain.

For the past five years, Jeff Hawkins has been working on computer intelligence. Better known as the co-founder of Palm Computing and Handspring, Hawkins has been interested in the similarities and differences between brains and computers for over two decades.

In 2002, with the encouragement of some neuroscientist friends, he created the Redwood Neuroscience Institute (RNI). For three years, he worked with about 10 other scientists there on all aspects of neocortical anatomy, physiology, and theory. More than 100 other scientists visited RNI.

By 2004, Hawkins had developed and published a theory of what has come to be called Hierarchical Temporal Memory (HTM), but it was still rooted in biology. He wasn't able to turn the biological theory into a practical technology until a colleague, Dileep George, showed how HTM could be modeled on a type of Bayesian network.

George's prototype application was a vision system that recognized line drawings of 50 different objects, independent of size, position, distortion, and noise. Although it wasn't designed to solve a practical problem, it did things no other existing vision system could do.
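The article doesn't describe how George's prototype achieved position invariance, but one standard way to recognize a shape regardless of where it appears is to score a stored template at every possible shift and keep the best match. The sketch below is purely illustrative (the function names and the tiny corner-shaped patterns are invented for this example), not a reconstruction of the actual system:

```python
# Illustrative sketch: translation-invariant matching by max-pooling
# a template's overlap score over all shifts. Patterns are sets of
# (x, y) pixel coordinates.

def shifted(pattern, dx, dy):
    """Return the pattern translated by (dx, dy)."""
    return {(x + dx, y + dy) for (x, y) in pattern}

def best_match(template, image, max_shift=5):
    """Best overlap between the template and the image over all
    translations up to max_shift in each direction."""
    return max(
        len(shifted(template, dx, dy) & image)
        for dx in range(-max_shift, max_shift + 1)
        for dy in range(-max_shift, max_shift + 1)
    )

corner = {(0, 0), (1, 0), (0, 1)}    # a small corner-shaped template
image = {(3, 4), (4, 4), (3, 5)}     # the same corner, shifted by (3, 4)
print(best_match(corner, image))     # 3: a full match despite the shift
```

Real systems pool over size and distortion as well as position, but the principle is the same: invariance comes from pooling a match score over a family of transformations.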

In 2005, with a theory of the neocortex, a mathematical expression of that theory, and a working prototype, Hawkins created a start-up, Numenta, in Menlo Park, Calif. Numenta has created a software platform that allows anyone to build HTMs for experimentation and deployment. You don't program an HTM as you would a computer; rather you configure it with software tools, then train it by exposing it to sensory data. HTMs thus learn in much the same way that children do.
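The configure-then-train workflow can be sketched in a few lines. The class below is a deliberately trivial stand-in, not Numenta's API: its names and its crude overlap-based inference are invented for illustration. The point it shows is the division of labor the article describes: the structure is configured once, with no task-specific code, and all task knowledge comes from exposure to labeled sensory examples.

```python
# Hypothetical sketch of a configure-then-train memory. Class and
# method names are illustrative only; patterns are sets of (x, y)
# pixel coordinates.

from collections import defaultdict

class TinyMemory:
    """A trivial pattern memory: configured empty, then trained
    purely by exposure to labeled examples."""

    def __init__(self):
        # "Configuration" step: structure only, no task-specific code.
        self.examples = defaultdict(list)

    def expose(self, label, pattern):
        # "Training" step: the memory simply accumulates experience.
        self.examples[label].append(frozenset(pattern))

    def infer(self, pattern):
        # Inference: return the label whose stored examples best
        # overlap the new pattern (a crude nearest-neighbor rule).
        pattern = frozenset(pattern)
        best_label, best_score = None, -1
        for label, stored in self.examples.items():
            score = max(len(pattern & p) for p in stored)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

mem = TinyMemory()
mem.expose("L", {(0, 0), (1, 0), (2, 0), (2, 1)})   # an "L" shape
mem.expose("T", {(0, 0), (0, 1), (0, 2), (1, 1)})   # a "T" shape
print(mem.infer({(0, 0), (1, 0), (2, 0)}))          # closer to "L"
```

A real HTM learns hierarchical, temporal structure rather than storing raw examples, but the usage pattern — configure, expose, infer — is the one the article describes.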

Carver Mead once said, "If we really understand a system we will be able to build it." Hawkins has built and tested enough HTMs of sufficient complexity to believe that they work, for at least some difficult and useful problems, such as handling distortion and variances in visual images.

The software development toolset that Numenta has released will allow scientists and developers to go much further. Already, researchers in industry, academia, and government are considering how to use HTMs to solve problems in data-rich areas like oil exploration and drug discovery. HTMs may also be able to tackle classic machine-intelligence problems such as speech and visual pattern recognition, as well as prediction tasks in meteorology and financial analysis.

Source: IEEE Spectrum Magazine
