Partner: Bartosz Paprocki
Doctoral thesis
2015-12-04 | Analysis of the efficiency of data transmission in neuronal cells and networks using methods of Information Theory (UKW; original Polish title: Analiza wydajności transmisji danych w komórkach i sieciach neuronowych metodami Teorii Informacji)
Recent publications
1. | Paprocki B.♦, Pręgowska A., Szczepański J., Does Adding of Neurons to the Network Layer Lead to Increased Transmission Efficiency?, IEEE Access, ISSN: 2169-3536, DOI: 10.1109/ACCESS.2024.3379324, Vol.12, pp.42701-42709, 2024
Abstract: The aim of this study is to contribute to the important question in Neuroscience of whether the number of neurons in a given layer of a network affects transmission efficiency. Mutual Information, as defined by Shannon, between the input and output signals is analyzed theoretically and numerically for certain classes of networks. A Levy-Baxter probabilistic neural model is applied; this model includes all important qualitative mechanisms involved in the transmission process in the brain. We derived analytical formulas for the Mutual Information for input signals coming from Information Sources modeled as Bernoulli processes. These formulas depend on the parameters of the Information Source, the neurons, and the network. Numerical simulations were performed using these equations. It turned out that, beyond a certain value, the Mutual Information increases very slowly as neurons are added: the increase is at a rate of m^{-c}, where m is the number of neurons in the transmission layer and c is very small. The calculations also show that for a practical number of neurons (up to 15,000), the Mutual Information reaches only approximately half of the information carried by the input signal. The influence of noise on transmission efficiency as a function of the number of neurons was also analyzed; it turned out that the noise level at which transmission is optimal increases significantly with this number. Our results indicate that a large number of neurons in the network does not in itself bring an essential improvement in transmission efficiency, but can contribute to reliability. (A toy simulation of this layer setup appears in the code sketches after this list.)
Keywords: Shannon communication theory, neural network, network layer, transmission efficiency, mutual information, model of neuron, spike trains, information source, entropy
2. | Paprocki B.♦, Pręgowska A., Szczepański J., Optimizing information processing in brain-inspired neural networks, BULLETIN OF THE POLISH ACADEMY OF SCIENCES: TECHNICAL SCIENCES, ISSN: 0239-7528, DOI: 10.24425/bpasts.2020.131844, Vol.68, No.2, pp.225-233, 2020
Abstract: The way brain networks maintain high transmission efficiency is believed to be fundamental in understanding brain activity. Brains consisting of more cells render information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and function diversity, rather than just size and the number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels, and then we measure information as mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which comprises the essential qualitative information transfer mechanisms. In this paper, we overview and discuss our previous quantitative results regarding brain-inspired networks, addressing their qualitative consequences in the context of the broader literature. It is shown that mutual information is often maximized in a very noisy environment, e.g., one where only one-third of all input spikes are allowed to pass through noisy synapses and further into the network. Moreover, we show that inhibitory connections as well as properly placed long-range connections often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematical science plays an important role in the explanation of the nature of brain efficiency. Our results confirm that basic brain components that appear during the evolution process arise to optimise transmission performance. (A toy scan over the synaptic noise level appears in the code sketches after this list.)
Keywords: neural network, entropy, mutual information, noise, inhibitory neuron
3. | Paprocki B.♦, Szczepański J., How do the amplitude fluctuations affect the neuronal transmission efficiency, NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2012.11.001, Vol.104, pp.50-56, 2013
Abstract: Tremendous effort is being made to understand the nature of neuronal coding, its high efficiency, and the mechanisms governing it. Paprocki and Szczepanski [13] explored the neuron model proposed by Levy and Baxter [12] and analyzed its efficiency with respect to synaptic failure, activation threshold, firing rate, and type of input source. In this paper we study the influence of amplitude fluctuations (damping, uniform, and amplifying), another important component in neuronal computations. Efficiency is understood in the sense of mutual information (between input and output signals), which constitutes the fundamental concept of Shannon communication theory. Using high quality entropy estimators, we determined maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We also found that for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in the opposite way to the corresponding correlations between input and output signals. These calculations confirm that neuronal coding is much more subtle than the straightforward intuitive optimization of input-output correlations. (A toy comparison of fluctuation regimes appears in the code sketches after this list.)
Keywords: Neural computation, Mutual information, Amplitude fluctuation, Activation threshold, Synaptic failure, Entropy estimation
4. | Paprocki B.♦, Szczepański J., Transmission efficiency in ring, brain-inspired neuronal networks. Information and energetic aspects, Brain Research, ISSN: 0006-8993, DOI: 10.1016/j.brainres.2013.07.024, Vol.1536, pp.135-143, 2013
Abstract: Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of processes during information transmission in the brain. In this paper we study the information transmission rate per energy used in a class of ring, brain-inspired neural networks, which we assume to involve components such as excitatory and inhibitory neurons and long-range connections. In choosing the neuron model we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values. (A toy information-per-energy calculation appears in the code sketches after this list.)
Keywords: Information transmission efficiency, Mutual information, Brain inspired network, Inhibitory neuron, Long-range connection, Neuronal computation
5. | Paprocki B.♦, Szczepański J., Kołbuk D., Information transmission efficiency in neuronal communication systems, BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/1471-2202-14-S1-P217, Vol.14(Suppl 1), No.P217, pp.1-2, 2013
Abstract: The nature of brain transmission processes, with their high reliability and efficiency, is one of the most elusive areas of contemporary science [1]. We study information transmission efficiency by considering neuronal communication as a Shannon-type channel. Thus, using high quality entropy estimators, we evaluate the mutual information between input and output signals. We assume the neuron model proposed by Levy and Baxter [2], which incorporates all essential qualitative mechanisms participating in the neural transmission process. (A baseline entropy estimator appears in the code sketches after this list.)
Keywords: transmission efficiency, neuronal communication, Shannon-type channel
6. | Paprocki B.♦, Szczepański J., Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics, BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/j.biosystems.2011.03.005, Vol.105, pp.62-72, 2011
Abstract: There has been growing interest in the estimation of information carried by a single neuron and by multiple single units or populations of neurons in response to specific stimuli. In this paper, inspired by the article of Levy and Baxter (2002), we analyze the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (1948) and by considering such uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study the Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate, and type of input source. We observed a number of surprising, non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead to even a twofold increase in transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal value of the threshold for which a local maximum of Mutual Information is achieved for most of the neuronal architectures, regardless of the type of source (correlated or non-correlated). Additionally, to reach the global maximum, the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates. For higher firing rates the influence of synaptic noise on transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems; hence, based on our analysis, we conjecture that the neuronal architecture was adjusted to make more effective use of this attribute. (A toy threshold-noise grid appears in the code sketches after this list.)
Keywords: Neuronal computation, Entropy, Mutual Information, Estimators, Neuron, Quantal failure, Activation threshold, Neural network
|
Conference abstracts
1. | Paprocki B.♦, Pręgowska A., Szczepański J., Information Processing in Brain-Inspired Networks: Size and Density Effects, SolMech 2016, 40th Solid Mechanics Conference, 2016-08-29/09-02, Warszawa (PL), No.P192, pp.1-2, 2016
2. | Szczepański J., Sanchez-Vives M.V.♦, Arnold M.M.♦, Montejo N.♦, Paprocki B.♦, Pręgowska A., Amigó J.M.♦, Wajnryb E., Analyzing Neuroscience Signals using Information Theory and Complexity Shannon Communication Approach, 12th INCF Workshop on Node Communication and Collaborative Neuroinformatics, 2015-04-16/04-17, Warszawa (PL), pp.1-32, 2015
3. | Szczepański J., Paprocki B.♦, Transmission efficiency in the brain-like neuronal networks. Information and energetic aspects, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.127-128, 2012
Keywords: Neuronal Communication, Brain-like Network, Shannon Theory
4. | Paprocki B.♦, Szczepański J., Effectiveness of information transmission in the brain-like communication models, 10th International Neural Coding Workshop, 2012-09-02/09-08, Prague (CZ), pp.93-94, 2012
Keywords: Brain-like network, Information transmission, Neuronal computation