Hunch seeds: Info bio

post by the gears to ascension (lahwran) · 2023-02-17T21:25:58.422Z

Contents

  Michael Levin unfinished-research talk
  The cost of information acquisition by natural selection
  Cellular sentience as the primary source of biological order and evolution
  Semantic information, autonomous agency and non-equilibrium statistical physics
  Subjective Information and Survival in a Simulated Biological System
  more misc results

Just some stuff I found online. Not sure that all of these are actually good, but they seem promising as hunch inspiration. As usual, I'm sharing this because I'm not going to be the brain that uses these things to reach non-crackpot conclusions, but people who haven't had enough hunch seeding from this area of thought are probably missing insights, in my view.

Previous posts in this sequence are highly related to these topics, especially the call for submissions.

Michael Levin unfinished-research talk

He's been posting quite a few!

The cost of information acquisition by natural selection

Natural selection enriches genotypes that are well-adapted to their environment. Over successive generations, these changes to the frequencies of types accumulate information about the selective conditions. Thus, we can think of selection as an algorithm by which populations acquire information about their environment. Kimura (1961) pointed out that every bit of information that the population gains this way comes with a minimum cost in terms of unrealized fitness (substitution load). Due to the gradual nature of selection and ongoing mismatch of types with the environment, a population that is still gaining information about the environment has lower mean fitness than a counter-factual population that already has this information. This has been an influential insight, but here we find that experimental evolution of Escherichia coli with mutations in a RNA polymerase gene (rpoB) violates Kimura’s basic theory. To overcome the restrictive assumptions of Kimura’s substitution load and develop a more robust measure for the cost of selection, we turn to ideas from computational learning theory. We reframe the ‘learning problem’ faced by an evolving population as a population versus environment (PvE) game, which can be applied to settings beyond Kimura’s theory – such as stochastic environments, frequency-dependent selection, and arbitrary environmental change. We show that the learning theoretic concept of ‘regret’ measures relative lineage fitness and rigorously captures the efficiency of selection as a learning process. This lets us establish general bounds on the cost of information acquisition by natural selection. We empirically validate these bounds in our experimental system, showing that computational learning theory can account for the observations that violate Kimura’s theory. Finally, we note that natural selection is a highly effective learning process in that selection is an asymptotically optimal algorithm for the problem faced by evolving populations, and no other algorithm can consistently outperform selection in general. Our results highlight the centrality of information to natural selection and the value of computational learning theory as a perspective on evolutionary biology. 

https://www.semanticscholar.org/paper/The-cost-of-information-acquisition-by-natural-McGee-Kosterlitz/3e5c34e73362690d299399da20bb6050f8a5ea4c
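
If you want to poke at the regret framing, it reduces to something very playable in a toy replicator-dynamics model. The sketch below is my own illustration, not the paper's E. coli system (the fitness values and parameter names are all made up): it tracks cumulative regret, i.e. the shortfall in mean log-fitness relative to a counterfactual population already fixed on the best genotype, alongside the bits of information the population gains about which genotype that is.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a frequency vector, in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy setup (my own illustration, not the paper's experimental system):
# a haploid population with a few genotypes of fixed relative fitness,
# evolving by discrete-generation selection (replicator dynamics).
fitness = np.array([1.00, 1.05, 1.10, 1.20])    # hypothetical relative fitnesses
freq = np.full(len(fitness), 1 / len(fitness))  # start maximally uncertain

best_log_fitness = np.log(fitness.max())
cumulative_regret = 0.0
initial_entropy = entropy_bits(freq)

for generation in range(200):
    # Regret this generation: shortfall of realized mean log-fitness
    # relative to a counterfactual population fixed on the best genotype.
    mean_log_fitness = np.sum(freq * np.log(fitness))
    cumulative_regret += best_log_fitness - mean_log_fitness

    # Selection update (replicator dynamics).
    freq = freq * fitness
    freq /= freq.sum()

# Information "acquired" here = reduction in entropy of the genotype distribution.
info_gained_bits = initial_entropy - entropy_bits(freq)
print(f"information gained: {info_gained_bits:.3f} bits")
print(f"cumulative regret (substitution-load-like cost): {cumulative_regret:.3f}")
```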

 

Cellular sentience as the primary source of biological order and evolution

All life is cellular, starting some 4 billion years ago with the emergence of the first cells. In order to survive their early evolution in the face of an extremely challenging environment, the very first cells invented cellular sentience and cognition, allowing them to make relevant decisions to survive through creative adaptations in a continuously running evolutionary narrative. We propose that the success of cellular life has crucially depended on a biological version of Maxwell's demons which permits the extraction of relevant sensory information and energy from the cellular environment, allowing cells to sustain anti-entropic actions. These sensor-effector actions allowed for the creative construction of biological order in the form of diverse organic macromolecules, including crucial polymers such as DNA, RNA, and cytoskeleton. Ordered biopolymers store analogue (structures as templates) and digital (nucleotide sequences of DNA and RNA) information that functioned as a form of memory to support the development of organisms and their evolution. Crucially, all cells are formed by the division of previous cells, and their plasma membranes are physically and informationally continuous across evolution since the beginning of cellular life. It is argued that life is supported through life-specific principles which support cellular sentience, distinguishing life from non-life. Biological order, together with cellular cognition and sentience, allow the creative evolution of all living organisms as the authentic authors of evolutionary novelty.

(closed article, can't read; looks like it's not on scihub. looks very interesting anyhow.) https://www.sciencedirect.com/science/article/abs/pii/S0303264722000818 
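
For a sense of scale on the Maxwell's-demon claim: the standard Szilard/Landauer result says one bit of information about the environment is worth at most k_B·T·ln 2 of extractable work (equivalently, erasing a bit costs at least that much). The snippet below just evaluates that textbook bound at roughly physiological temperature; it's not from the (paywalled) article.

```python
import math

# Landauer/Szilard bound: maximum work extractable per bit of information
# about the environment (equivalently, minimum cost of erasing one bit).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # roughly physiological temperature, K

work_per_bit = k_B * T * math.log(2)  # joules per bit
print(f"~{work_per_bit:.2e} J per bit (~{work_per_bit / 1.602e-19:.4f} eV)")
```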

 

Semantic information, autonomous agency and non-equilibrium statistical physics

Shannon information theory provides various measures of so-called syntactic information, which reflect the amount of statistical correlation between systems. By contrast, the concept of ‘semantic information’ refers to those correlations which carry significance or ‘meaning’ for a given system. Semantic information plays an important role in many fields, including biology, cognitive science and philosophy, and there has been a long-standing interest in formulating a broadly applicable and formal theory of semantic information. In this paper, we introduce such a theory. We define semantic information as the syntactic information that a physical system has about its environment which is causally necessary for the system to maintain its own existence. ‘Causal necessity’ is defined in terms of counter-factual interventions which scramble correlations between the system and its environment, while ‘maintaining existence’ is defined in terms of the system's ability to keep itself in a low entropy state. We also use recent results in non-equilibrium statistical physics to analyse semantic information from a thermodynamic point of view. Our framework is grounded in the intrinsic dynamics of a system coupled to an environment, and is applicable to any physical system, living or otherwise. It leads to formal definitions of several concepts that have been intuitively understood to be related to semantic information, including ‘value of information’, ‘semantic content’ and ‘agency’. 

https://www.semanticscholar.org/paper/Semantic-information%2C-autonomous-agency-and-physics-Kolchinsky-Wolpert/daca9c48f63a289c27d55ab5837457bf60877e5b 
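
The counterfactual-scrambling definition is concrete enough to prototype. What follows is a minimal sketch of the general idea, not the authors' formalism: a toy "organism" noisily copies an environmental state and harvests resources when the copy matches; scrambling the system–environment correlation (the intervention) while keeping the marginals fixed shows how much that correlation is worth for staying viable, with accumulated resources standing in for the paper's low-entropy-state criterion.

```python
import numpy as np

rng = np.random.default_rng(0)

def run(scramble: bool, steps: int = 5000, sensor_accuracy: float = 0.9):
    """Toy viability estimate: the agent earns resources when its internal
    state matches the environment's state, and pays a small cost otherwise."""
    env = rng.integers(0, 2, size=steps)           # binary environment states
    noise = rng.random(steps) > sensor_accuracy
    agent = np.where(noise, 1 - env, env)          # noisy internal copy of the environment
    if scramble:
        # Counterfactual intervention: destroy the correlation, keep the marginals.
        agent = rng.permutation(agent)
    resources = np.where(agent == env, 1.0, -0.5)  # payoff for matching vs. mismatching
    return resources.sum()

viability_intact = run(scramble=False)
viability_scrambled = run(scramble=True)
# The gap is a crude stand-in for the "value" of the system-environment
# correlation for maintaining the system, in the spirit of semantic information.
print(f"viability with correlations intact:    {viability_intact:.0f}")
print(f"viability with correlations scrambled: {viability_scrambled:.0f}")
```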

Subjective Information and Survival in a Simulated Biological System

Information transmission and storage have gained traction as unifying concepts to characterize biological systems and their chances of survival and evolution at multiple scales. Despite the potential for an information-based mathematical framework to offer new insights into life processes and ways to interact with and control them, the main legacy is that of Shannon’s, where a purely syntactic characterization of information scores systems on the basis of their maximum information efficiency. The latter metrics seem not entirely suitable for biological systems, where transmission and storage of different pieces of information (carrying different semantics) can result in different chances of survival. Based on an abstract mathematical model able to capture the parameters and behaviors of a population of single-celled organisms whose survival is correlated to information retrieval from the environment, this paper explores the aforementioned disconnect between classical information theory and biology. In this paper, we present a model, specified as a computational state machine, which is then utilized in a simulation framework constructed specifically to reveal emergence of a “subjective information”, i.e., trade-off between a living system’s capability to maximize the acquisition of information from the environment, and the maximization of its growth and survival over time. Simulations clearly show that a strategy that maximizes information efficiency results in a lower growth rate with respect to the strategy that gains less information but contains a higher meaning for survival. 

https://www.semanticscholar.org/paper/Subjective-Information-and-Survival-in-a-Simulated-Barker-Pierobon/b32cb42c3d7ec603e2e8b9180eb6613115c321c6 
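
The trade-off in this abstract can be caricatured in a few lines. The toy below is a deliberately crude stand-in for the paper's state-machine model, with all parameters invented: one strategy pays to sense every environmental bit (maximal information acquisition), one senses only the single bit that predicts a lethal condition, and one senses nothing. That's enough to show "more information" and "more growth" pulling apart once acquisition has a cost.

```python
import numpy as np

rng = np.random.default_rng(1)

STEPS = 1000
N_BITS = 8            # bits available in the environment each step (hypothetical)
SENSE_COST = 0.02     # growth-rate cost per bit sensed (hypothetical)
TOXIN_PENALTY = 0.5   # growth-rate hit if the toxin is present and goes undetected

def mean_growth_rate(bits_sensed: int, senses_toxin_bit: bool) -> float:
    """Average growth rate of a toy lineage over STEPS random environment states."""
    total = 0.0
    for _ in range(STEPS):
        env = rng.integers(0, 2, size=N_BITS)    # bit 0 = "toxin present"
        growth = 0.1 - SENSE_COST * bits_sensed  # sensing is metabolically costly
        if env[0] == 1 and not senses_toxin_bit:
            growth -= TOXIN_PENALTY              # missed the one bit that mattered
        total += growth
    return total / STEPS

# "Maximize information" vs. "acquire only the survival-relevant bit" vs. "sense nothing".
print("sense every bit:      ", mean_growth_rate(N_BITS, True))
print("sense only toxin bit: ", mean_growth_rate(1, True))
print("sense nothing:        ", mean_growth_rate(0, False))
```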

 

more misc results

From https://metaphor.systems/search?q=mutual%20information%20%2C%20and%20its%20information%20theory%20relationship%20to%20friendliness%20between%20beings%20in%20an%20environment mildly filtered:
