If you find Complexity Thoughts interesting, follow us! Click on the Like button, leave a comment, repost on Substack or share this post. It is the only feedback I get for this free service. The frequency and quality of this newsletter rely on social interactions. Thank you!
→ Don’t miss the podcast version of this post: click on “Our Podcast” above!
Evolution
Inducing novel endosymbioses by implanting bacteria in fungi
This is so interesting! Endosymbiosis is a process in which one organism lives inside the cells of another in a mutually beneficial relationship, leading to highly integrated partnerships over evolutionary time. This is distinct from holosymbiosis (which we covered in a previous Issue), a broader concept describing interdependent relationships in which multiple organisms coexist as a single, cohesive unit without necessarily residing within each other’s cells. In holosymbiosis, each organism maintains its individuality while contributing to a functional collective, as seen in coral reefs composed of coral polyps and symbiotic algae.
Lynn Margulis popularized the endosymbiotic theory in the 1960s, proposing that organelles like mitochondria and chloroplasts originated from once free-living bacteria that ancient eukaryotic cells engulfed. These bacteria, through a process of integration and coevolution, became essential organelles, driving the evolution of complex eukaryotic cells (a major evolutionary transition!).
Endosymbioses have profoundly impacted the evolution of life and continue to shape the ecology of a wide range of species. They give rise to new combinations of biochemical capabilities that promote innovation and diversification [1,2]. Despite the many examples of known endosymbioses across the tree of life, their de novo emergence is rare and challenging to uncover in retrospect [3,4,5]. Here we implant bacteria into the filamentous fungus Rhizopus microsporus to follow the fate of artificially induced endosymbioses. Whereas Escherichia coli implanted into the cytosol induced septum formation, effectively halting endosymbiogenesis, Mycetohabitans rhizoxinica was transmitted vertically to the progeny at a low frequency. Continuous positive selection on endosymbiosis mitigated initial fitness constraints by several orders of magnitude upon adaptive evolution. Phenotypic changes were underscored by the accumulation of mutations in the host as the system stabilized. The bacterium produced rhizoxin congeners in its new host, demonstrating the transfer of a metabolic function through induced endosymbiosis. Single-cell implantation thus provides a powerful experimental approach to study critical events at the onset of endosymbiogenesis and opens opportunities for synthetic approaches towards designing endosymbioses with desired traits.
Biological Systems
Correlations reveal the hierarchical organization of biological networks with latent variables
Deciphering the functional organization of large biological networks is a major challenge for current mathematical methods. A common approach is to decompose networks into largely independent functional modules, but inferring these modules and their organization from network activity is difficult, given the uncertainties and incompleteness of measurements. Typically, some parts of the overall functional organization, such as intermediate processing steps, are latent. We show that the hidden structure can be determined from the statistical moments of observable network components alone, as long as the functional relevance of the network components lies in their mean values and the mean of each latent variable maps onto a scaled expectation of a binary variable. Whether the function of biological networks permits a hierarchical modularization can be falsified by a correlation-based statistical test that we derive. We apply the test to gene regulatory networks, dendrites of pyramidal neurons, and networks of spiking neurons.
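The key premise — that a hidden common driver leaves a statistical fingerprint in the correlations of the components it feeds — can be illustrated with a minimal toy model. Everything below is a hypothetical construction for illustration (a binary latent variable `z` driving two observed nodes), not the paper's actual networks or test:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy model: a latent binary module state z drives two observable
# components x1, x2; they are conditionally independent given z.
z = rng.random(n) < 0.3                      # hidden binary variable
x1 = 2.0 * z + rng.normal(0, 1, n)           # observed node 1
x2 = -1.5 * z + rng.normal(0, 1, n)          # observed node 2

# Marginal correlation between x1 and x2 is nonzero purely because of z...
r_marginal = np.corrcoef(x1, x2)[0, 1]

# ...but vanishes within a fixed latent state (conditional independence),
# which is what exposes the hidden variable from observables alone.
r_given_z0 = np.corrcoef(x1[~z], x2[~z])[0, 1]

print(round(r_marginal, 2), round(r_given_z0, 2))
```

The pattern — substantial marginal correlation that disappears once the latent state is fixed — is the kind of second-moment structure the paper exploits to infer hidden hierarchical organization.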
Neuroscience
What a month for neuroscience… a bunch of amazing papers appeared as a collection dedicated to the first complete connectome of the fruit fly (140k neurons, 50M+ connections!). Check this News&Views for a deeper (yet still concise) summary.
My favorite? Honestly, I could not read all of them, but I was captivated by the two that focus strongly on networks:
Whole-brain annotation and multi-connectome cell typing of Drosophila
The fruit fly Drosophila melanogaster has emerged as a key model organism in neuroscience, in large part due to the concentration of collaboratively generated molecular, genetic and digital resources available for it. Here we complement the approximately 140,000-neuron FlyWire whole-brain connectome [1] with a systematic and hierarchical annotation of neuronal classes, cell types and developmental units (hemilineages). Of 8,453 annotated cell types, 3,643 were previously proposed in the partial hemibrain connectome [2], and 4,581 are new types, mostly from brain regions outside the hemibrain subvolume. Although nearly all hemibrain neurons could be matched morphologically in FlyWire, about one-third of cell types proposed for the hemibrain could not be reliably reidentified. We therefore propose a new definition of cell type as groups of cells that are each quantitatively more similar to cells in a different brain than to any other cell in the same brain, and we validate this definition through joint analysis of FlyWire and hemibrain connectomes. Further analysis defined simple heuristics for the reliability of connections between brains, revealed broad stereotypy and occasional variability in neuron count and connectivity, and provided evidence for functional homeostasis in the mushroom body through adjustments of the absolute amount of excitatory input while maintaining the excitation/inhibition ratio. Our work defines a consensus cell type atlas for the fly brain and provides both an intellectual framework and open-source toolchain for brain-scale comparative connectomics.
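The proposed cell-type definition — each cell is quantitatively more similar to some cell in a different brain than to any other cell in its own brain — can be sketched with toy feature vectors. The three "types", the feature dimension, and the noise scale below are all assumptions for illustration, not the paper's morphology/connectivity features:

```python
import numpy as np

rng = np.random.default_rng(3)

# Three hypothetical cell types, each with a 4-dimensional feature vector;
# two "brains" sample the same types with small within-type noise.
type_centres = rng.normal(0, 5, (3, 4))
brain_a = type_centres + rng.normal(0, 0.1, (3, 4))
brain_b = type_centres + rng.normal(0, 0.1, (3, 4))

def satisfies_criterion(i):
    """Is cell i of brain A closer to a cell in brain B than to any
    other cell in brain A? (The cross-brain cell-type criterion.)"""
    d_cross = np.linalg.norm(brain_b - brain_a[i], axis=1).min()
    own = np.delete(brain_a, i, axis=0)
    d_within = np.linalg.norm(own - brain_a[i], axis=1).min()
    return d_cross < d_within

print(all(satisfies_criterion(i) for i in range(3)))
```

When within-type variability across brains is smaller than between-type separation within a brain, every cell passes — which is exactly the regime in which types are reliably re-identifiable across individuals.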
Network statistics of the whole-brain connectome of Drosophila
Brains comprise complex networks of neurons and connections, similar to the nodes and edges of artificial networks. Network analysis applied to the wiring diagrams of brains can offer insights into how they support computations and regulate the flow of information underlying perception and behaviour. The completion of the first whole-brain connectome of an adult fly, containing over 130,000 neurons and millions of synaptic connections [1,2,3], offers an opportunity to analyse the statistical properties and topological features of a complete brain. Here we computed the prevalence of two- and three-node motifs, examined their strengths, related this information to both neurotransmitter composition and cell type annotations [4,5], and compared these metrics with wiring diagrams of other animals. We found that the network of the fly brain displays rich-club organization, with a large population (30% of the connectome) of highly connected neurons. We identified subsets of rich-club neurons that may serve as integrators or broadcasters of signals. Finally, we examined subnetworks based on 78 anatomically defined brain regions or neuropils. These data products are shared within the FlyWire Codex (https://codex.flywire.ai) and should serve as a foundation for models and experiments exploring the relationship between neural activity and anatomical structure.
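Rich-club organization means that high-degree nodes are far more densely interconnected than their degree alone would suggest. A minimal sketch of the unweighted, directed rich-club density on a hypothetical hub-plus-periphery network (not the FlyWire data; in practice one also compares against degree-preserving randomizations):

```python
import numpy as np

def rich_club(adj, k):
    """Edge density among nodes with total degree > k (directed, unweighted)."""
    deg = adj.sum(axis=0) + adj.sum(axis=1)  # in-degree + out-degree
    rich = np.flatnonzero(deg > k)
    m = len(rich)
    if m < 2:
        return np.nan
    sub = adj[np.ix_(rich, rich)]
    return sub.sum() / (m * (m - 1))         # fraction of possible edges present

# Toy network: a densely wired "club" of 5 hubs plus 15 peripheral
# nodes that each send a single edge into one hub.
n = 20
adj = np.zeros((n, n), dtype=int)
for i in range(5):
    for j in range(5):
        if i != j:
            adj[i, j] = 1                    # hubs fully interconnected
for p in range(5, n):
    adj[p, p % 5] = 1                        # periphery feeds into hubs

print(rich_club(adj, k=6))                   # density among high-degree nodes
```

Here the hub clique is fully wired, so the density among nodes with degree above 6 is the maximal value of 1.0, while the network as a whole is sparse — the signature a rich-club analysis looks for.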
Signatures of criticality in efficient coding networks
To study information processing in the human brain and how it relates to optimization principles, two main theories have been proposed: one explores the benefits of neural networks operating near criticality (see this perfect review), while the other assumes they have evolved for optimal computation and efficient encoding of natural input (see this).
Instead of tuning the network around the critical point and evaluating its statistical information processing capabilities, we optimize a network to perform a clearly defined computation and investigate whether signatures of critical dynamics emerge in the optimized network.
The critical brain hypothesis states that the brain can benefit from operating close to a second-order phase transition. While it has been shown that several computational aspects of sensory processing (e.g., sensitivity to input) can be optimal in this regime, it is still unclear whether these computational benefits of criticality can be leveraged by neural systems performing behaviorally relevant computations. To address this question, we investigate signatures of criticality in networks optimized to perform efficient coding. We consider a spike-coding network of leaky integrate-and-fire neurons with synaptic transmission delays. Previously, it was shown that the performance of such networks varies nonmonotonically with the noise amplitude. Interestingly, we find that in the vicinity of the optimal noise level for efficient coding, the network dynamics exhibit some signatures of criticality, namely, scale-free dynamics of the spiking and the presence of crackling noise relation. Our work suggests that two influential, and previously disparate theories of neural processing optimization (efficient coding and criticality) may be intimately related.
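One signature of criticality invoked above is scale-free avalanche statistics. As a stand-in for the paper's spike-coding LIF network, a critical branching process — a standard minimal model, not the authors' system — shows how heavy-tailed avalanche sizes appear exactly at branching ratio 1 and are sharply cut off below it:

```python
import numpy as np

rng = np.random.default_rng(1)

def avalanche_size(sigma, max_steps=10_000):
    """Total activity of a branching process with branching ratio sigma."""
    active, total = 1, 1
    for _ in range(max_steps):
        active = rng.poisson(sigma * active)  # each unit triggers ~sigma others
        total += active
        if active == 0:
            break
    return total

# At the critical point (sigma = 1) avalanche sizes are scale-free,
# P(S) ~ S^(-3/2); subcritically (sigma = 0.5) large avalanches vanish.
crit = np.array([avalanche_size(1.0) for _ in range(20_000)])
sub = np.array([avalanche_size(0.5) for _ in range(20_000)])

print(np.mean(crit > 100), np.mean(sub > 100))
```

At criticality a non-negligible fraction of avalanches exceed any fixed size, while subcritical dynamics produce essentially none — the kind of contrast used to diagnose whether an optimized network sits near the transition.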
Spontaneous Brain Activity Emerges from Pairwise Interactions in the Larval Zebrafish Brain
So, do we really need higher-order models for connectomes?
Brain activity is characterized by brainwide spatiotemporal patterns that emerge from synapse-mediated interactions between individual neurons. Calcium imaging provides access to in vivo recordings of whole-brain activity at single-neuron resolution and, therefore, allows the study of how large-scale brain dynamics emerge from local activity. In this study, we use a statistical mechanics approach—the pairwise maximum entropy model—to infer microscopic network features from collective patterns of activity in the larval zebrafish brain and relate these features to the emergence of observed whole-brain dynamics. Our findings indicate that the pairwise interactions between neural populations and their intrinsic activity states are sufficient to explain observed whole-brain dynamics. In fact, the pairwise relationships between neuronal populations estimated with the maximum entropy model strongly correspond to observed structural connectivity patterns. Model simulations also demonstrated how tuning pairwise neuronal interactions drives transitions between observed physiological regimes and pathologically hyperexcitable whole-brain regimes. Finally, we use virtual resection to identify the brain structures that are important for maintaining the brain in a physiological dynamic regime. Together, our results indicate that whole-brain activity emerges from a complex dynamical system that transitions between basins of attraction whose strength and topology depend on the connectivity between brain areas.
Weak pairwise correlations imply strongly correlated network states in a neural population
Biological networks have so many possible states that exhaustive sampling is impossible. Successful analysis thus depends on simplifying hypotheses, but experiments on many systems hint that complicated, higher-order interactions among large groups of elements have an important role. Here we show, in the vertebrate retina, that weak correlations between pairs of neurons coexist with strongly collective behaviour in the responses of ten or more neurons. We find that this collective behaviour is described quantitatively by models that capture the observed pairwise correlations but assume no higher-order interactions. These maximum entropy models are equivalent to Ising models, and predict that larger networks are completely dominated by correlation effects. This suggests that the neural code has associative or error-correcting properties, and we provide preliminary evidence for such behaviour. As a first test for the generality of these ideas, we show that similar results are obtained from networks of cultured cortical neurons.
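The pairwise maximum-entropy model used here is an Ising model whose fields h and couplings J are tuned so that the model's means and pairwise moments match the data's. For a handful of neurons the fit can be done by exact enumeration of all firing patterns; below is a minimal sketch with synthetic target moments (assumed for the demo, not retinal data):

```python
import numpy as np
from itertools import product

N = 4
# All 2^N binary firing patterns of the small population.
states = np.array(list(product([0, 1], repeat=N)), dtype=float)

def model_moments(h, J):
    """Means <x_i> and pairwise moments <x_i x_j> under P(x) ∝ exp(h·x + x·J·x)."""
    E = states @ h + np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    return p @ states, states.T @ (p[:, None] * states)

# Synthetic "data": target moments generated from a known Ising model.
rng = np.random.default_rng(2)
J_true = np.triu(rng.normal(0, 0.5, (N, N)), 1)
m_target, C_target = model_moments(rng.normal(-1, 0.5, N), J_true)

# Gradient ascent on the log-likelihood reduces to moment matching:
# raise h_i while <x_i> is too low, raise J_ij while <x_i x_j> is too low.
h, J = np.zeros(N), np.zeros((N, N))
for _ in range(5000):
    m, C = model_moments(h, J)
    h += 0.5 * (m_target - m)
    J += 0.5 * np.triu(C_target - C, 1)

print(np.abs(m - m_target).max())  # fitted means match the targets
```

The same moment-matching logic underlies fits to real populations, where exact enumeration is replaced by Monte Carlo sampling once N grows beyond ~20 neurons.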
Griffiths phases and the stretching of criticality in brain networks
Hallmarks of criticality, such as power-laws and scale invariance, have been empirically found in cortical-network dynamics and it has been conjectured that operating at criticality entails functional advantages, such as optimal computational capabilities, memory and large dynamical ranges. As critical behaviour requires a high degree of fine tuning to emerge, some type of self-tuning mechanism needs to be invoked. Here we show that, taking into account the complex hierarchical-modular architecture of cortical networks, the singular critical point is replaced by an extended critical-like region that corresponds—in the jargon of statistical mechanics—to a Griffiths phase. Using computational and analytical approaches, we find Griffiths phases in synthetic hierarchical networks and also in empirical brain networks such as the human connectome and that of Caenorhabditis elegans. Stretched critical regions, stemming from structural disorder, yield enhanced functionality in a generic way, facilitating the task of self-organizing, adaptive and evolutionary mechanisms selecting for criticality.
Multiscale organization of neuronal activity unifies scale-dependent theories of brain function
Brain recordings collected at different resolutions support distinct signatures of neural coding, leading to scale-dependent theories of brain function. Here, we show that these disparate signatures emerge from a heavy-tailed, multiscale functional organization of neuronal activity observed across calcium-imaging recordings collected from the whole brains of zebrafish and C. elegans as well as from sensory regions in Drosophila, mice, and macaques. Network simulations demonstrate that this conserved hierarchical structure enhances information processing. Finally, we find that this organization is maintained despite significant cross-scale reconfiguration of cellular coordination during behavior. Our findings suggest that this nonlinear organization of neuronal activity is a universal principle conserved for its ability to adaptively link behavior to neural dynamics across multiple spatiotemporal scales while balancing functional resiliency and information processing efficiency.