APS March Meeting 2023
Volume 68, Number 3
Las Vegas, Nevada (March 5-10)
Virtual (March 20-22); Time Zone: Pacific Time
Session S02: Stochastic Thermodynamics of Biological and Artificial Information Processing
8:00 AM–10:48 AM, Thursday, March 9, 2023
Room: Room 125
Sponsoring Unit: GSNP
Chair: Martin van Hecke, AMOLF Amsterdam & Leiden University; Marc Serra
Abstract: S02.00001 : Extending computational complexity theory to include thermodynamic resource costs*
8:00 AM–8:36 AM
Presenter: David Wolpert (Santa Fe Institute)
Author: David Wolpert (Santa Fe Institute)
The central concern of computational complexity theory is how the minimal "resource costs" needed to perform a given computation on a given type of computational machine depend on the size of the input. One of the most common choices for resource cost is the number of iterations the computation requires to complete. The canonical example is the classes P and NP, which concern how the number of iterations a Turing machine needs to perform a computation depends on the size of the input. However, in the real world, some of the most important resource costs of performing a computation are thermodynamic, e.g., the amount of free energy required to perform the computation and the amount of heat it produces. In this talk I will summarize recent results on how thermodynamic resource costs depend on the computational machine that is used and on the computation being performed. I will start by reviewing results on the thermodynamic costs of performing a given computation in a (loop-free and branch-free) digital circuit and on a Turing machine (TM, implemented as the associated 'single-iteration' partial function). In particular, I will review results showing that considering the minimal entropy production (EP) of computing a desired output on a TM, rather than the minimal size of an input string that causes the TM to produce that output (the output's Kolmogorov complexity), leads to a correction term to Kolmogorov complexity. After this review I will summarize a set of new results concerning deterministic finite automata (DFAs). The first of these results is derived in an inclusive Hamiltonian framework and relates the minimal EP required by a DFA to recognize a language to the state complexity of that DFA. The other results I will review are formulated in a continuous-time Markov chain framework. The first of these concerns the unavoidable mismatch-cost contribution to EP that arises if the DFA's rate matrix is periodic, undergoing the same trajectory repeatedly for each iteration. The second set of results comprises stopping-time fluctuation theorems for the total EP generated by a DFA by the time it reaches its halt state. I will end by describing the vast new set of research issues at the intersection of stochastic thermodynamics and computer science theory, issues that expand both fields.
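For readers unfamiliar with the mismatch cost referenced above, a minimal sketch in standard stochastic-thermodynamics notation (assuming the usual Kolchinsky–Wolpert decomposition of entropy production; the exact expressions used in the talk may differ):

\[
\sigma(p_0) \;=\; \big[ D(p_0 \,\|\, q_0) - D(p_1 \,\|\, q_1) \big] \;+\; \sigma(q_0),
\]

where $p_0$ is the actual initial distribution over the machine's states, $q_0$ is the "prior" distribution that minimizes EP for the given dynamics, $p_1$ and $q_1$ are their images after one iteration, $D(\cdot\|\cdot)$ is the Kullback–Leibler divergence, and $\sigma$ denotes entropy production. The bracketed drop in KL divergence is the mismatch cost; it is non-negative, and a fixed, periodically repeated rate matrix cannot be re-optimized for the evolving state distribution, so this contribution accrues anew at each iteration. A stopping-time fluctuation theorem of the kind mentioned would, in its standard integral form, read $\langle e^{-\sigma_\tau} \rangle = 1$, with $\tau$ the (random) time at which the halt state is first reached, which by Jensen's inequality implies $\langle \sigma_\tau \rangle \ge 0$.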
*I would like to acknowledge support from the Santa Fe Institute and the US National Science Foundation under grant 2221345