Principles of neural information theory
From Thermodynamics of Computation
- Reference groups: Review Articles and Books
- Authors: James V Stone
- Title: Principles of Neural Information Theory
- Year: 2018
- Pages: 200
- Publisher: Sebtel Press
- Abstract: The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated Further Readings, this book is an ideal introduction to cutting-edge research in neural information theory.
Counts
- Citation count
- Page views: 8
Identifiers
- ISBN: 978-0993367922
- Website: http://jim-stone.staff.shef.ac.uk/BookNeuralInfo/NeuralInfoMain.html