Literature/1982/Hopfield
Significance
In 1982, physicist John Hopfield showed that a form of neural network (now called a "Hopfield net") could learn and process information in a completely new way. Around the same time, David Rumelhart popularized a new method for training neural networks called "backpropagation" (discovered years earlier by Paul Werbos). These two discoveries revived the field of connectionism, which had been largely abandoned since 1970. The new field was unified and inspired by the appearance of Parallel Distributed Processing in 1986, a two-volume collection of papers edited by Rumelhart and psychologist James McClelland. Neural networks became commercially successful in the 1990s, when they began to be used as the engines driving programs such as optical character recognition and speech recognition.
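The paper's central idea, a content-addressable memory whose state settles into stored patterns, can be sketched in a few lines of code. The snippet below is only an illustrative toy: the function names (train, recall) and the 8-bit example pattern are our own choices, not code or notation from the paper, though the Hebbian outer-product storage rule and the asynchronous sign-threshold update are the standard formulation of a Hopfield net.

```python
import numpy as np

def train(patterns):
    """Store bipolar (+1/-1) patterns with the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=100, seed=0):
    """Asynchronous updates: flip one unit at a time toward lower energy."""
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(steps):
        i = rng.integers(len(s))
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Usage: corrupt a stored pattern, then let the network settle back to it.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                     # flip two bits
print(recall(W, noisy))             # typically recovers the stored pattern
```

Each update can only lower (or keep) an energy function of the network state, so the dynamics converge to a fixed point; stored patterns sit at or near these minima, which is what makes the net act as an associative memory.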