Artificial Neural Networks: Learning Algorithms, Performance Evaluation, and Applications
Springer Science & Business Media, Jun 29, 2013 - 440 pages

1.1 Overview

We are living in a decade recently declared as the "Decade of the Brain". Neuroscientists may soon manage to work out a functional map of the brain, thanks to technologies that open windows on the mind. With the average human brain consisting of 15 billion neurons, roughly equal to the number of stars in our Milky Way, each receiving signals through as many as 10,000 synapses, it is quite a view. "The brain is the last and greatest biological frontier", says James Watson, co-discoverer of the structure of DNA; the brain is considered to be the most complex piece of biological machinery on earth. After many years of research by neuroanatomists and neurophysiologists, the overall organization of the brain is well understood, but many of its detailed neural mechanisms remain to be decoded. In order to understand the functioning of the brain, neurobiologists have taken a bottom-up approach of studying the stimulus-response characteristics of single neurons and networks of neurons, while psychologists have taken a top-down approach of studying brain functions from the cognitive and behavioral level. While these two approaches are gradually converging, it is generally accepted that it may take another fifty years before we achieve a solid microscopic, intermediate, and macroscopic understanding of the brain.
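The scale quoted above can be made concrete with a back-of-the-envelope calculation: multiplying the stated neuron count by the stated synapses per neuron gives an upper bound on the number of connections in such a network. This is an illustrative sketch using the overview's own figures, not a computation from the book.

```python
# Rough upper bound on total synaptic connections,
# using the figures quoted in the overview.
neurons = 15_000_000_000       # ~15 billion neurons
synapses_per_neuron = 10_000   # up to 10,000 synapses each

total_connections = neurons * synapses_per_neuron
print(f"~{total_connections:.1e} connections")  # ~1.5e+14 connections
```

A fully connected artificial network at this scale would need on the order of 10^14 weights, which is why the learning algorithms discussed in this book operate on far smaller, layered architectures.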
Contents
Appendix A2  76
Efficient LEarning Algorithms  86
Fast Learning Algorithms  141
Algorithms for Learning  194
Performance Evaluation of Single-layered  219
High-order Neural Networks  259
Future Trends and Directions  371