Neural Networks: A Systematic Introduction

Neural networks are a computing paradigm that is attracting increasing attention among computer scientists. In this book, theoretical laws and models previously scattered in the literature are brought together into a general theory of artificial neural nets. Always with a view to biology and starting with the simplest nets, it is shown how the properties of models change when more general computing elements and net topologies are introduced. Each chapter contains examples, numerous illustrations, and a bibliography. The book is aimed at readers who seek an overview of the field or who wish to deepen their knowledge. It is suitable as a basis for university courses in neurocomputing.
Contents
Perceptron Learning | 79
The Backpropagation Algorithm | 149
Fast Learning Algorithms | 185
Fuzzy Logic | 287
Associative Networks | 309
The Hopfield Model | 335
Stochastic Networks | 371
Kohonen Networks | 389
Modular Neural Networks | 411
Genetic Algorithms | 427
Hardware for Neural Networks | 449
Common terms and phrases
action potential, approximation, artificial neural networks, Assume, automata, axon, backpropagation, backpropagation algorithm, binary, biological, Boltzmann machine, Boolean, cell membrane, cluster, coding, complex, components, computing units, connections, constant, convergence, corresponds, defined, edges, energy function, equation, error function, example, feed-forward, finite, function F, genetic algorithms, given, gradient, Hebbian learning, hidden layer, hidden units, Hopfield network, implement, input space, input vector, ionic channels, ions, iteration, kind, Kohonen network, learning algorithm, learning problem, linear associator, linear separation, logical functions, McCulloch-Pitts units, method, minimal, n-dimensional, neocognitron, neurons, node, operators, optimal, output unit, parameters, pattern, perceptron, perceptron learning, pixel, plane, points, polynomial, possible, primitive functions, processor, produce, pseudoinverse, quadratic, receptive field, recursive, regions, result, retina, shown in Figure, shows, sigmoid, signals, simulated, solution, step, stochastic, strings, synapse, training set, transformed, transmitted, unsupervised learning, update, variables, w₁, weight matrix, weight space, weight vector, zero