The study estimates the energy costs of information processing in biological systems

Credit: Pixabay/CC0 Public Domain


The behavior, physiology, and existence of living organisms are supported by myriad biological processes that involve communication between cells and other molecular components. These molecular components are known to transfer information to each other in various ways, for example through processes known as diffusion and electrical depolarization, or by the exchange of mechanical waves.

Scientists from Yale University recently conducted a study aimed at calculating the energy costs of this transfer of information between cells and molecular components. Their work, published in Physical Review Letters, presents a new tool that could be used to study cellular networks and better understand their function.

“We’ve been thinking about this project in one form or another for some time,” said Benjamin B. Machta, one of the researchers who conducted the study.

“I first discussed the ideas that eventually morphed into this project with my PhD advisor Jim Sethna about ten years ago, but for various reasons the work never quite got off the ground. Sam and I started talking about it when we were thinking about how to understand the energy cost that biology must expend to compute—the subject of much of his PhD—and perhaps more broadly to ensure that its parts are coherent and controlled, and he figured out how to carry out those computations.”

Recent work by Machta and his colleague Samuel J. Bryant draws inspiration from earlier work published in the late 1990s, particularly the efforts of Simon Laughlin and his collaborators. At the time, this research group sought to experimentally determine how much energy neurons expend when sending information.

“Laughlin and colleagues found that this energy expenditure ranged between 10⁴ and 10⁷ kBT/bit depending on the details, which is much higher than the ‘baseline’ limit of ~kBT/bit, sometimes called the Landauer bound, which you have to pay to erase a bit of information,” explained Machta.
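For scale, the Landauer bound and the measured range quoted above can be converted to joules directly. The sketch below is an illustration at physiological temperature, not a calculation from the paper:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # approximate physiological temperature, K

# Landauer bound: minimum energy to erase one bit is k_B * T * ln(2)
landauer_J = k_B * T * math.log(2)
print(f"Landauer bound at 310 K: {landauer_J:.2e} J/bit")

# Laughlin et al.'s measured range: ~1e4 to 1e7 k_B T per bit
for factor in (1e4, 1e7):
    print(f"{factor:.0e} k_B T/bit = {factor * k_B * T:.2e} J/bit")
```

Even the low end of the measured range sits four orders of magnitude above the erasure bound, which is the gap the study sets out to explain.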

“In some ways, we wanted to understand: was this an example of biology simply being wasteful? Or were there other costs that had to be paid? Specifically, the Landauer limit doesn’t mention geometry or physical details. The use of the Landauer bound is itself subtle, because you only pay to erase information; it’s possible to compute reversibly, never erase anything, and pay no computation cost – but that’s not the focus here.”

Another goal of Machta and Bryant’s recent study was to see if optimizing these energy costs could shed light on why molecular systems communicate with each other using different physical mechanisms in different situations. For example, while neurons typically communicate with each other through electrical signals, other cell types communicate through the diffusion of chemicals.

“We wanted to understand in which regime each of these mechanisms (and others) would be best in terms of energy cost per bit,” Machta said. “In all our calculations, we consider information that is sent through a physical channel, from a physical sender of information (such as a ‘transmitting’ ion channel that opens and closes to send a signal) to a receiver (such as a membrane voltage detector, which can also be an ion channel). The core of the calculation is the textbook calculation of the information rate through a Gaussian channel, but with a few new twists.”
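The textbook Gaussian-channel result Machta refers to gives an information rate set by the ratio of signal to noise power. A minimal numerical sketch of that standard formula, with purely illustrative numbers (not the paper's model):

```python
import math

def gaussian_channel_rate(bandwidth_hz, signal_power, noise_power):
    """Shannon rate in bits/s for a flat Gaussian channel:
    C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1.0 + signal_power / noise_power)

# Illustrative values only: a 1 kHz channel with signal-to-noise ratio 10
rate = gaussian_channel_rate(1e3, 10.0, 1.0)
print(f"{rate:.0f} bits/s")
```

The paper's "new twists" amount to tying both the signal power and the noise spectrum to the physical currents and thermal environment of the cell, rather than treating them as free parameters.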

First, in their estimates, Machta and his colleagues always consider a physical channel in which currents of physical particles and electrical charges are transmitted according to the physics of the cell. Second, the team always assumed that the channel was corrupted by thermal noise in the cellular environment.

“We can calculate the spectrum of this noise using the fluctuation-dissipation theorem, which relates the spectrum of thermal fluctuations to near-equilibrium response functions,” explained Machta.
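In its standard classical form (stated here for context, not quoted from the paper), the fluctuation-dissipation theorem fixes the thermal noise spectrum in terms of the dissipative part of the system's response function:

```latex
% Classical fluctuation-dissipation theorem:
% S(\omega) is the power spectrum of thermal fluctuations,
% \chi(\omega) the linear response function; only its imaginary
% (dissipative) part contributes.
S(\omega) = \frac{2 k_B T}{\omega}\,\mathrm{Im}\,\chi(\omega)
```

This is what lets the authors compute the noise floor a signal must overcome directly from near-equilibrium physics, with no free noise parameters.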

Another distinctive feature of the team's estimates is that they were derived from relatively simple models. This allowed the researchers to place conservative lower bounds on the energy required to power the channel and drive the physical currents in the biological system.

“Since the signal must overcome the thermal noise, we usually find the cost with a geometric prefactor multiplying kBT/bit,” Machta said.

“This geometric factor can depend on the sizes of the transmitter and receiver; a large transmitter generally reduces the cost per bit by allowing the dissipative current to spread over a larger area. Additionally, a larger receiver allows for greater averaging of thermal fluctuations, so a weaker overall signal can still carry the same information.”

“So, for example, for electrical signaling, we get a form for the cost per bit that scales as (r²/σIσO) kBT/bit, where r is the distance between transmitter and receiver and σI and σO are the sender and receiver sizes. Importantly, for ion channels that are a few nanometers in diameter but send information across microns, this cost could easily be many orders of magnitude higher than the kBT/bit that simpler (or more fundamental) arguments suggest as a lower bound.”
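Plugging the nanometer and micron scales mentioned in the quote into this geometric prefactor shows how quickly it dwarfs the bare kBT/bit floor. The values below are illustrative stand-ins, not the paper's exact numbers:

```python
# Geometric prefactor r^2 / (sigma_I * sigma_O) from the quoted scaling.
# Illustrative scales: ion channels ~ a few nm wide, signaling over ~1 micron.
r = 1e-6          # transmitter-receiver distance, m (1 micron)
sigma_in = 3e-9   # transmitter (sender) size, m (~3 nm)
sigma_out = 3e-9  # receiver size, m (~3 nm)

prefactor = r**2 / (sigma_in * sigma_out)
print(f"cost per bit ~ {prefactor:.1e} k_B T")
```

With these numbers the prefactor is of order 10⁵, comfortably inside the 10⁴ to 10⁷ kBT/bit range that Laughlin's experiments reported.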

Overall, the calculations performed by Machta and his colleagues confirm the high energy costs associated with the transfer of information between cells. Ultimately, their estimates could be the beginning of an explanation for the high information processing costs measured in experimental studies.

“Our explanation is less ‘fundamental’ than the Landauer bound in that it depends on the geometry of neurons and ion channels and other details,” Machta said. “However, if biology is subject to these details, then it may be that (for example) neurons are efficient and face real information/energy constraints, rather than simply being inefficient. These calculations are certainly not enough to say that any particular system is efficient, but they suggest that sending information through space may require very high energy costs.”

In the future, this recent work by Machta and his colleagues could lead to interesting new biological studies. In their paper, the researchers also present a “phase diagram” mapping the situations in which specific communication strategies (e.g., electrical signaling, chemical diffusion) are optimal.

This diagram could soon help to better understand the design principles of different cell signaling strategies. For example, it could explain why neurons use chemical diffusion to communicate at synapses, but use electrical signals to send information across hundreds of microns from dendrites to the cell body; and also why E. coli bacteria use diffusion to send information about their chemical environment.

“One thing we’re working on now is to try to apply this framework to understanding the energetics of a particular signal transduction system,” Machta added.

“Our recent work only looked at the abstract cost of sending information between two separate components. In real systems there are typically information-processing networks, and applying our bound requires understanding the flow of information in these networks. This goal also comes with new technical challenges, such as applying our calculations to particular geometries (for example a ‘spherical’ neuron, or an axon that resembles a tube, each importantly different from the infinite plane we used here).”

More information:
Samuel J. Bryant et al., Physical Constraints in Intracellular Signaling: The Cost of Sending a Bit, Physical Review Letters (2023). DOI: 10.1103/PhysRevLett.131.068401

Journal information:
Physical Review Letters