Partner: Paweł Czyż

University of Oxford (GB)

Recent publications
1. Grabowski F., Czyż P., Kochańczyk M., Lipniacki T., Limits to the rate of information transmission through the MAPK pathway, Journal of the Royal Society Interface, ISSN: 1742-5689, DOI: 10.1098/rsif.2018.0792, Vol. 16, No. 152, pp. 20180792-1-10, 2019
Abstract:

Two important signalling pathways, NF-κB and ERK, transmit merely 1 bit of information about the level of extracellular stimulation. It is thus unclear how such systems can coordinate complex cell responses to external cues. We analyse information transmission in the MAPK/ERK pathway that converts both constant and pulsatile EGF stimulation into pulses of ERK activity. Based on an experimentally verified computational model, we demonstrate that, when input consists of sequences of EGF pulses, transmitted information increases nearly linearly with time. Thus, pulse-interval transcoding allows more information to be relayed than the amplitude–amplitude transcoding considered previously for the ERK and NF-κB pathways. Moreover, the information channel capacity C, or simply bitrate, is not limited by the bandwidth B = 1/τ, where τ ≈ 1 h is the relaxation time. Specifically, when the input is provided in the form of sequences of short binary EGF pulses separated by intervals that are multiples of τ/n (but not shorter than τ), then for n = 2, C ≈ 1.39 bit/h, and for n = 4, C ≈ 1.86 bit/h. The capability to respond to random sequences of EGF pulses enables cells to propagate spontaneous ERK activity waves across tissue.
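
For context on the quoted capacities: if they are read as the Shannon capacity of a noiseless channel whose symbols are inter-pulse intervals of duration kτ/n for k ≥ n, then C = log2(X0), where X0 is the largest real root of Σ_k X^(-kτ/n) = 1. The short Python sketch below reproduces C ≈ 1.39 bit/h for n = 2 and C ≈ 1.86 bit/h for n = 4 under that reading; it is an illustrative reconstruction of the stated figures, not the authors' stochastic pathway model.

import numpy as np
from scipy.optimize import brentq

def pulse_interval_capacity(n, tau=1.0, k_max=2000):
    """Capacity (bit per unit time) of a noiseless channel whose symbols are
    inter-pulse intervals of duration k*tau/n for k = n, ..., k_max.
    C = log2(X0), where X0 solves sum_k X**(-k*tau/n) = 1 (Shannon, 1948)."""
    durations = np.arange(n, k_max + 1) * tau / n

    def characteristic(x):
        # sum_k x**(-d_k) - 1 is strictly decreasing in x for x > 1
        return np.sum(x ** (-durations)) - 1.0

    x0 = brentq(characteristic, 1.0 + 1e-9, 1e6)
    return np.log2(x0)

for n in (2, 4):
    print(f"n = {n}: C ≈ {pulse_interval_capacity(n):.2f} bit/h")
# prints C ≈ 1.39 bit/h for n = 2 and C ≈ 1.86 bit/h for n = 4, matching the abstract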

Keywords:

cellular signal transduction, pulsatile stimulation, pulse-interval transcoding, bandwidth, representation problem

Affiliations:
Grabowski F.-other affiliation
Czyż P.-University of Oxford (GB)
Kochańczyk M.-IPPT PAN
Lipniacki T.-IPPT PAN

Conference papers
1. Czyż P., Grabowski F., Vogt J., Beerenwinkel N., Marx A., Beyond Normal: On the Evaluation of Mutual Information Estimators, NeurIPS 2023, Advances in Neural Information Processing Systems, 2023-12-10/12-16, New Orleans (US), pp. 1-34, 2023
Abstract:

Mutual information is a general statistical dependency measure which has found applications in representation learning, causality, domain generalization and computational biology. However, mutual information estimators are typically evaluated on simple families of probability distributions, namely the multivariate normal distribution and selected distributions with one-dimensional random variables. In this paper, we show how to construct a diverse family of distributions with known ground-truth mutual information and propose a language-independent benchmarking platform for mutual information estimators. We discuss the general applicability and limitations of classical and neural estimators in settings involving high dimensions, sparse interactions, long-tailed distributions, and high mutual information. Finally, we provide guidelines for practitioners on how to select an appropriate estimator adapted to the difficulty of the problem considered, and which issues one needs to consider when applying an estimator to a new data set.
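
As an illustration of the construction described above (an assumption-laden sketch, not the paper's benchmark code): mutual information is invariant under invertible transformations of each variable, so pushing samples from a bivariate normal distribution, whose mutual information is known in closed form, through nonlinear monotone maps yields a non-normal, heavier-tailed distribution with the same ground-truth value. The transformations, sample size and the scikit-learn k-NN estimator used below are illustrative choices.

import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)

# Bivariate normal with correlation rho: ground-truth MI is -0.5*log(1 - rho**2) nats.
rho, n_samples = 0.8, 5_000
cov = np.array([[1.0, rho], [rho, 1.0]])
x, y = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_samples).T
true_mi = -0.5 * np.log(1.0 - rho**2)

# Invertible marginal maps leave mutual information unchanged, but the joint
# distribution is no longer normal (cubing produces heavier tails).
x_t, y_t = x**3, np.arcsinh(y)

est_normal = mutual_info_regression(x.reshape(-1, 1), y, random_state=0)[0]
est_transformed = mutual_info_regression(x_t.reshape(-1, 1), y_t, random_state=0)[0]

print(f"ground truth:                 {true_mi:.3f} nats")
print(f"k-NN estimate, normal data:   {est_normal:.3f} nats")
print(f"k-NN estimate, transformed:   {est_transformed:.3f} nats")

In finite samples the k-NN estimate on the transformed data can drift from the ground truth even though the true value is unchanged; comparing such gaps across distribution families is the kind of evaluation the benchmark formalizes.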

Affiliations:
Czyż P.-University of Oxford (GB)
Grabowski F.-IPPT PAN
Vogt J.-other affiliation
Beerenwinkel N.-other affiliation
Marx A.-other affiliation