Partner: Ehud Kaplan

Icahn School of Medicine at Mount Sinai (US)

Recent publications
1. Pręgowska A., Kaplan E., Szczepański J., How far can neural correlations reduce uncertainty? Comparison of information transmission rates for Markov and Bernoulli processes, International Journal of Neural Systems, ISSN: 0129-0657, DOI: 10.1142/S0129065719500035, Vol. 29, No. 8, pp. 1950003-1-13, 2019
Abstract:

The nature of neural codes is central to neuroscience. Do neurons encode information through relatively slow changes in their firing rates (rate code), or through the precise timing of individual spikes (temporal code)? Here we compare the loss of information due to correlations for these two possible neural codes. The essence of Shannon's definition of information is to link information with uncertainty: the higher the uncertainty of a given event, the more information that event conveys. Correlations can reduce uncertainty, and hence the amount of information, but by how much? In this paper we address this question by directly comparing the information per symbol conveyed by words coming from a binary Markov source (temporal code) with the information per symbol coming from the corresponding Bernoulli source (uncorrelated, rate code). In a previous paper we found that the relation between information transmission rates (ITRs) and firing rates is governed by a parameter s, the sum of the transition probabilities from the no-spike state to the spike state and vice versa; here the same parameter s again plays a crucial role. We calculated the maximal and minimal bounds of the quotient of the ITRs for these sources. Next, making use of the entropy grouping axiom, we determined the loss of information in a Markov source compared with the information in the corresponding Bernoulli source for a given word length. Our results show that in the case of correlated signals the loss of information is relatively small, so that temporal codes, which are more energetically efficient, can effectively replace rate codes. These results were confirmed by experiments. (A minimal numerical sketch of the Markov-versus-Bernoulli comparison follows this entry.)

Keywords:

Shannon information theory, information source, information transmission rate, firing rate, neural coding

Affiliations:
Pręgowska A.-IPPT PAN
Kaplan E.-Icahn School of Medicine at Mount Sinai (US)
Szczepański J.-IPPT PAN
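
The Markov-versus-Bernoulli comparison described in the abstract above can be illustrated with a short Python sketch. This is not code from the paper; the function names and example transition probabilities are illustrative assumptions. For a binary Markov source with transition probabilities p01 (no-spike to spike) and p10 (spike to no-spike), s = p01 + p10, and the entropy rate (information per symbol) is compared with that of the Bernoulli source having the same stationary spike probability.

import numpy as np

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def markov_vs_bernoulli(p01, p10):
    """
    Compare the information per symbol (entropy rate, bits/symbol) of a
    two-state Markov source (temporal code) with the Bernoulli source
    (rate code) that has the same stationary spike probability.
    p01: transition probability no-spike -> spike
    p10: transition probability spike -> no-spike
    s = p01 + p10 is the parameter highlighted in the paper.
    """
    s = p01 + p10
    # Stationary distribution of the two-state chain.
    pi_spike = p01 / s
    pi_silent = p10 / s
    # Entropy rate of the Markov chain: average row entropy under the
    # stationary distribution.
    h_markov = pi_silent * binary_entropy(p01) + pi_spike * binary_entropy(p10)
    # Entropy of the memoryless (Bernoulli) source with the same firing rate.
    h_bernoulli = binary_entropy(pi_spike)
    return h_markov, h_bernoulli, h_markov / h_bernoulli

if __name__ == "__main__":
    # Example transition probabilities chosen for illustration only.
    for p01, p10 in [(0.1, 0.3), (0.2, 0.2), (0.05, 0.6)]:
        hm, hb, ratio = markov_vs_bernoulli(p01, p10)
        print(f"p01={p01}, p10={p10}, s={p01 + p10:.2f}: "
              f"Markov {hm:.3f} vs Bernoulli {hb:.3f} bits/symbol, ratio {ratio:.3f}")

When s = 1 the two-state chain is memoryless and the two entropy rates coincide; for small s the source is strongly correlated and its entropy rate falls well below the Bernoulli value, which is the kind of information loss the paper bounds.
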
2. Pręgowska A., Casti A., Kaplan E., Wajnryb E., Szczepański J., Information processing in the LGN: a comparison of neural codes and cell types, Biological Cybernetics, ISSN: 0340-1200, DOI: 10.1007/s00422-019-00801-0, Vol. 113, No. 4, pp. 453-464, 2019
Abstract:

To understand how anatomy and physiology allow an organism to perform its function, it is important to know how information transmitted by spikes in the brain is received and encoded. A natural question is whether the spike rate alone encodes the information about a stimulus (rate code), or whether additional information is contained in the temporal pattern of the spikes (temporal code). Here we address this question using data from the cat Lateral Geniculate Nucleus (LGN), the visual portion of the thalamus, through which visual information from the retina is communicated to the visual cortex. We analyzed the responses of LGN neurons to spatially homogeneous spots of various sizes with temporally random luminance modulation. We compared the Firing Rate with the Shannon Information Transmission Rate, which quantifies the information contained in the temporal relationships between spikes. We found that the behavior of these two rates can differ quantitatively, suggesting that the energy used for spiking does not translate directly into the amount of information transmitted. We also compared Firing Rates with Information Rates for X-ON and X-OFF cells. For X-ON cells the Firing Rate and the Information Rate often behave in completely different ways, while for X-OFF cells the two rates are much more highly correlated. Our results suggest that X-ON cells employ a more efficient "temporal code", while X-OFF cells use a straightforward "rate code", which is more reliable and is correlated with energy consumption. (A minimal sketch of estimating both rates from a binarized spike train follows this entry.)

Keywords:

Shannon information theory, cat LGN, ON–OFF cells, neural coding, entropy, firing rate

Affiliations:
Pręgowska A.-IPPT PAN
Casti A.-Fairleigh Dickinson University (US)
Kaplan E.-Icahn School of Medicine at Mount Sinai (US)
Wajnryb E.-IPPT PAN
Szczepański J.-IPPT PAN
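
The two rates compared in the abstract above can be sketched with a generic plug-in (word-entropy) estimate applied to a binarized spike train. This is a simplified illustration, not the estimator used in the study; the function names, bin size, word length, and synthetic spike train below are assumptions.

import numpy as np
from collections import Counter

def firing_rate(spikes, bin_size_s):
    """Mean firing rate in spikes/s from a binary (0/1) spike train."""
    return spikes.mean() / bin_size_s

def entropy_rate_bits_per_s(spikes, word_len, bin_size_s):
    """
    Plug-in estimate of the entropy rate of a binarized spike train:
    entropy of words of `word_len` bins, divided by the word duration.
    """
    n_words = len(spikes) // word_len
    words = spikes[: n_words * word_len].reshape(n_words, word_len)
    counts = Counter(map(tuple, words))
    probs = np.array(list(counts.values()), dtype=float) / n_words
    word_entropy_bits = -np.sum(probs * np.log2(probs))
    return word_entropy_bits / (word_len * bin_size_s)

if __name__ == "__main__":
    # Synthetic spike train for illustration: 1 ms bins, ~20 spikes/s.
    rng = np.random.default_rng(0)
    bin_size = 0.001
    spikes = (rng.random(200_000) < 0.02).astype(int)
    print("firing rate:", firing_rate(spikes, bin_size), "spikes/s")
    print("entropy rate:", entropy_rate_bits_per_s(spikes, 8, bin_size), "bits/s")

The firing rate counts spikes per second, while the word-entropy estimate bounds, in bits per second, the information that the temporal pattern of spikes could carry; contrasting how the two quantities vary across stimulus conditions is the kind of comparison the paper draws between rate and temporal codes.
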