Santa Fe Institute Collaboration Platform

Thermodynamics of Computation

Property:Abstract

This is a property of type Text.

Showing 6 pages using this property.
It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.
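As an illustrative aside (my addition, not part of the abstract above): the kT-scale dissipation mentioned here underlies the Landauer limit of kT ln 2 of heat per irreversibly erased bit, which is easy to evaluate numerically at room temperature.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact value in the 2019 SI)
T = 300.0           # room temperature in kelvin

# Landauer limit: minimum heat dissipated per irreversible bit operation
landauer_joules = k_B * T * math.log(2)

print(f"{landauer_joules:.3e} J per bit at {T} K")  # ~2.87e-21 J
```

Real logic gates dissipate many orders of magnitude more than this, which is why the bound is of conceptual rather than engineering interest.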
Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like ‘20 questions’ before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides give support for teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
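A brief sketch (my addition, not from the book blurb): the ‘20 questions’ framing reflects the fact that n yes/no answers carry at most n bits, and Shannon entropy measures how many such answers a source requires on average.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair yes/no question yields exactly 1 bit of information:
print(entropy_bits([0.5, 0.5]))  # 1.0
# So 20 ideal questions can distinguish 2**20 = 1,048,576 possibilities:
print(2 ** 20)
```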
Recently, it has been shown that for out-of-equilibrium systems, there are additional constraints on thermodynamical evolution besides the ordinary second law. These form a new family of second laws of thermodynamics, which are equivalent to the monotonicity of quantum Rényi divergences. In black hole thermodynamics, the usual second law is manifest as the area increase theorem. Hence one may ask whether these additional laws imply new restrictions on gravitational dynamics, such as for out-of-equilibrium black holes. Inspired by this question, we study these constraints within the AdS/CFT correspondence. First, we show that the Rényi divergence can be computed via a Euclidean path integral for a certain class of excited CFT states. Applying this construction to the boundary CFT, the Rényi divergence is evaluated as the renormalized action for a particular bulk solution of a minimally coupled gravity-scalar system. Further, within this framework, we show that there exist transitions which are allowed by the traditional second law, but forbidden by the additional thermodynamical constraints. We speculate on the implications of our findings.
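For intuition (an illustrative addition; the paper works with quantum Rényi divergences, of which the classical, commuting case below is a special case): the order-α Rényi divergence of distributions p, q is D_α(p‖q) = (1/(α−1)) log Σ_i p_i^α q_i^(1−α), and the "family of second laws" amounts to requiring that it never increase under the allowed dynamics, for every α at once.

```python
import math

def renyi_divergence(p, q, alpha):
    """Classical order-alpha Renyi divergence in nats (alpha > 0, alpha != 1)."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

p = [0.7, 0.2, 0.1]
q = [1 / 3, 1 / 3, 1 / 3]  # e.g. a thermal (here: uniform) reference state
for a in (0.5, 2.0, 10.0):
    # Nonnegative, and nondecreasing in alpha -- each alpha gives a constraint
    print(a, renyi_divergence(p, q, a))
```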
The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency; limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated Further Readings, this book is an ideal introduction to cutting-edge research in neural information theory.
We present a KL-control treatment of the fundamental problem of erasing a bit. We introduce notions of "reliability" of information storage via a reliability timescale τ_r, and "speed" of erasing via an erasing timescale τ_e. Our problem formulation captures the tradeoff between speed, reliability, and the Kullback-Leibler (KL) cost required to erase a bit. We show that rapid erasing of a reliable bit costs at least log(τ_r/τ_e), which goes to ∞ when τ_r/τ_e → ∞.
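A toy numerical illustration (my addition; the wiki export stripped the math from this abstract, so the bound is assumed here to take the form log(τ_r/τ_e)): the lower bound on the KL cost of erasure diverges as the reliability timescale grows relative to the erasing timescale.

```python
import math

def erasure_cost_lower_bound(tau_r, tau_e):
    """Assumed form of the speed/reliability bound: log(tau_r / tau_e) nats.

    tau_r: reliability timescale (how long the stored bit stays trustworthy)
    tau_e: erasing timescale (how quickly the bit is erased)
    """
    return math.log(tau_r / tau_e)

# The more reliable the bit relative to the erasing time, the higher the cost:
for ratio in (10, 1e3, 1e6):
    print(ratio, erasure_cost_lower_bound(ratio, 1.0))
```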