Brain dynamics exhibit rich spatiotemporal behavior whose stability is neither ordered nor chaotic, indicating that neural networks operate in intermediate stability regimes, including critical dynamics characterized by a power-law distribution of avalanche sizes with a negative exponent. However, it is unknown which stability regime allows global and local information transmission at reduced metabolic cost, measured in terms of synaptic potentials and action potentials. In this work, using a hierarchical neuron model with rich-club organization, we measure the average number of action potentials required to activate n different neurons (the avalanche size). In addition, we derive a mathematical formula for the metabolic cost of synaptic potentials. We run simulations varying the synaptic amplitude, the synaptic time course (ms), and the hub excitatory/inhibitory ratio, and compare the resulting dynamic regimes in terms of avalanche size versus metabolic cost. We also apply the dynamic model to Drosophila and Erdős–Rényi networks to compute their dynamics and metabolic costs. The results show that synaptic amplitude and time course play a key role in information propagation: they can drive the system from subcritical to supercritical regimes. The latter result supports the coexistence of critical regimes with a wide range of hub excitation/inhibition ratios. Moreover, subcritical or silent regimes minimize metabolic cost for local (small) avalanche sizes, whereas critical and intermediate stability regimes offer the best compromise between information propagation and reduced metabolic consumption, also minimizing metabolic cost over a wide range of avalanche sizes.
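The relationship between the stability regime and the cost of propagation can be illustrated with a minimal toy model, not the authors' hierarchical rich-club network: a Galton–Watson branching process in which each spike recruits further spikes with branching parameter sigma (subcritical below 1, critical at 1, supercritical above 1). The total spike count per avalanche serves here as a crude stand-in for metabolic cost in action potentials; the function names and parameters (`avalanche_size`, `fanout`, `cap`) are illustrative choices, not part of the original model.

```python
import random

def avalanche_size(sigma, fanout=10, cap=10_000, rng=random):
    """Total number of spikes in one avalanche of a branching
    process with branching parameter sigma (mean offspring per spike).
    sigma < 1: subcritical, sigma = 1: critical, sigma > 1: supercritical."""
    active, total = 1, 1            # start from a single spiking neuron
    p = sigma / fanout              # per-target activation probability
    while active and total < cap:
        nxt = 0
        for _ in range(active):     # each active neuron tries `fanout` targets
            nxt += sum(rng.random() < p for _ in range(fanout))
        total += nxt
        active = nxt
    return min(total, cap)

rng = random.Random(0)
sub = [avalanche_size(0.5, rng=rng) for _ in range(2000)]   # subcritical
sup = [avalanche_size(1.5, rng=rng) for _ in range(200)]    # supercritical
print(sum(sub) / len(sub))   # small avalanches: cheap, local propagation
print(sum(sup) / len(sup))   # large avalanches: global but costly
```

In this sketch the subcritical regime produces short, metabolically cheap avalanches, while the supercritical regime routinely saturates the network, mirroring the trade-off between information propagation and spike cost discussed above.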
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.