
Neural Networks and Analog Computation by Hava T. Siegelmann

$186.99
Condition - New
Only 2 left

Neural Networks and Analog Computation Summary

Neural Networks and Analog Computation: Beyond the Turing Limit by Hava T. Siegelmann

The theoretical foundations of Neural Networks and Analog Computation conceptualize neural networks as a particular type of computer consisting of multiple assemblies of basic processors interconnected in an intricate structure. Examining these networks under various resource constraints reveals a continuum of computational devices, several of which coincide with well-known classical models. On a mathematical level, the treatment of neural computations not only enriches the theory of computation but also explicates the computational complexity associated with biological networks, adaptive engineering tools, and related models from the fields of control theory and nonlinear dynamics. The material in this book will be of interest to researchers in a variety of engineering and applied sciences disciplines. In addition, the work may serve as the basis for a graduate-level seminar in neural networks for computer science students.
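
The networks the book analyzes update all processors synchronously, applying a saturating activation to an affine combination of the current state and the external input, with the weights drawn from the integers, the rationals, or the reals - the three regimes compared across the chapters. The short Python sketch below is only an illustration of that update rule, not code from the book; the saturated-linear activation is assumed, and the matrix sizes, weight values, and variable names are invented for the example.

```python
from fractions import Fraction

def sigma(x):
    """Saturated-linear activation: clips x to the interval [0, 1]."""
    return min(max(x, Fraction(0)), Fraction(1))

def step(x, u, W, U, c):
    """One synchronous update: x_i <- sigma(sum_j W[i][j]*x[j] + sum_j U[i][j]*u[j] + c[i])."""
    n = len(x)
    return [
        sigma(sum(W[i][j] * x[j] for j in range(n))
              + sum(U[i][j] * u[j] for j in range(len(u)))
              + c[i])
        for i in range(n)
    ]

# Two processors, one input line, rational weights: illustrative values only.
W = [[Fraction(1, 2), Fraction(1, 4)],
     [Fraction(0),    Fraction(1, 2)]]
U = [[Fraction(1)],
     [Fraction(0)]]
c = [Fraction(0), Fraction(1, 4)]

x = [Fraction(0), Fraction(0)]
for u in ([Fraction(1)], [Fraction(0)], [Fraction(1)]):
    x = step(x, u, W, U, c)
    print(x)
```

Using Fraction keeps the state exactly rational, matching the rational-weight setting of Chapter 3 (the Turing-equivalent regime in the table of contents below); the integer, real, and stochastic settings of Chapters 2, 4, and 9 correspond to the other points on the continuum described in the summary.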

Neural Networks and Analog Computation Reviews

All three of the primary questions are considered: What computational models can the net simulate (within polynomial bounds)? What are the computational complexity classes relevant to the net? How does the net (which, after all, is an analog device) relate to Church's thesis? Moreover, the power of the basic model is also analyzed when the domain of the reals is replaced by the rationals and the integers.

-Mathematical Reviews

Siegelmann's book focuses on the computational complexity of neural networks and on making this research accessible...the book accomplishes this task nicely.

-SIAM Review, Vol. 42, No. 3

Table of Contents

1 Computational Complexity
1.1 Neural Networks
1.2 Automata: A General Introduction
1.2.1 Input Sets in Computability Theory
1.3 Finite Automata
1.3.1 Neural Networks and Finite Automata
1.4 The Turing Machine
1.4.1 Neural Networks and Turing Machines
1.5 Probabilistic Turing Machines
1.5.1 Neural Networks and Probabilistic Machines
1.6 Nondeterministic Turing Machines
1.6.1 Nondeterministic Neural Networks
1.7 Oracle Turing Machines
1.7.1 Neural Networks and Oracle Machines
1.8 Advice Turing Machines
1.8.1 Circuit Families
1.8.2 Neural Networks and Advice Machines
1.9 Notes
2 The Model
2.1 Variants of the Network
2.1.1 A System Diagram Interpretation
2.2 The Network's Computation
2.3 Integer Weights
3 Networks with Rational Weights
3.1 The Turing Equivalence Theorem
3.2 Highlights of the Proof
3.2.1 Cantor-like Encoding of Stacks
3.2.2 Stack Operations
3.2.3 General Construction of the Network
3.3 The Simulation
3.3.1 P-Stack Machines
3.4 Network with Four Layers
3.4.1 A Layout of the Construction
3.5 Real-Time Simulation
3.5.1 Computing in Two Layers
3.5.2 Removing the Sigmoid from the Main Layer
3.5.3 One-Layer Network Simulates TM
3.6 Inputs and Outputs
3.7 Universal Network
3.8 Nondeterministic Computation
4 Networks with Real Weights
4.1 Simulating Circuit Families
4.1.1 The Circuit Encoding
4.1.2 A Circuit Retrieval
4.1.3 Circuit Simulation by a Network
4.1.4 The Combined Network
4.2 Networks Simulation by Circuits
4.2.1 Linear Precision Suffices
4.2.2 The Network Simulation by a Circuit
4.3 Networks versus Threshold Circuits
4.4 Corollaries
5 Kolmogorov Weights: Between P and P/poly
5.1 Kolmogorov Complexity and Reals
5.2 Tally Oracles and Neural Networks
5.3 Kolmogorov Weights and Advice Classes
5.4 The Hierarchy Theorem
6 Space and Precision
6.1 Equivalence of Space and Precision
6.2 Fixed Precision Variable Sized Nets
7 Universality of Sigmoidal Networks
7.1 Alarm Clock Machines
7.1.1 Adder Machines
7.1.2 Alarm Clock and Adder Machines
7.2 Restless Counters
7.3 Sigmoidal Networks are Universal
7.3.1 Correctness of the Simulation
7.4 Conclusions
8 Different-limits Networks
8.1 At Least Finite Automata
8.2 Proof of the Interpolation Lemma
9 Stochastic Dynamics
9.1 Stochastic Networks
9.1.1 The Model
9.2 The Main Results
9.2.1 Integer Networks
9.2.2 Rational Networks
9.2.3 Real Networks
9.3 Integer Stochastic Networks
9.4 Rational Stochastic Networks
9.4.1 Rational Set of Choices
9.4.2 Real Set of Choices
9.5 Real Stochastic Networks
9.6 Unreliable Networks
9.7 Nondeterministic Stochastic Networks
10 Generalized Processor Networks
10.1 Generalized Networks: Definition
10.2 Bounded Precision
10.3 Equivalence with Neural Networks
10.4 Robustness
11 Analog Computation
11.1 Discrete Time Models
11.2 Continuous Time Models
11.3 Hybrid Models
11.4 Dissipative Models
12 Computation Beyond the Turing Limit
12.1 The Analog Shift Map
12.2 Analog Shift and Computation
12.3 Physical Relevance
12.4 Conclusions

Additional information

SKU: NLS9781461268758
ISBN-13: 9781461268758
ISBN-10: 1461268753
Title: Neural Networks and Analog Computation: Beyond the Turing Limit by Hava T. Siegelmann
Condition: New
Binding: Paperback
Publisher: Springer-Verlag New York Inc.
Publication date: 2012-10-21
Number of pages: 181
Book picture is for illustrative purposes only, actual binding, cover or edition may vary.
This is a new book - be the first to read this copy. With untouched pages and a perfect binding, your brand new copy is ready to be opened for the first time.
