If you find Complexity Thoughts interesting, follow us! Click on the Like button, leave a comment, repost on Substack, or share this post. It is the only feedback I receive for this free service; the frequency and quality of this newsletter rely on social interactions. Thank you!
Dall-e 3 representation of this issue’s content
Foundations of network science and complex systems
Heating and cooling are fundamentally asymmetric and evolve along distinct pathways
According to conventional wisdom, a system placed in an environment with a different temperature tends to relax to the temperature of the latter, mediated by the flows of heat or matter that are set solely by the temperature difference. It is becoming clear, however, that thermal relaxation is much more intricate when temperature changes push the system far from thermodynamic equilibrium. Here, by using an optically trapped colloidal particle, we show that microscale systems under such conditions heat up faster than they cool down. We find that between any pair of temperatures, heating is not only faster than cooling but the respective processes, in fact, evolve along fundamentally distinct pathways, which we explain with a new theoretical framework that we call thermal kinematics. Our results change the view of thermalization at the microscale and will have a strong impact on energy-conversion applications and thermal management of microscopic devices, particularly in the operation of Brownian heat engines.
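A toy calculation can illustrate the asymmetry under strong simplifying assumptions: an overdamped particle in a harmonic trap whose position distribution stays Gaussian during relaxation, with the KL divergence used as a stand-in for the paper's thermal-kinematics distance. All function names and parameter values below are mine, not the authors':

```python
import math

def kl_to_equilibrium(var_t, var_eq):
    """KL divergence between two centred Gaussians, used here as a
    proxy for the distance from thermal equilibrium."""
    r = var_t / var_eq
    return 0.5 * (r - 1 - math.log(r))

def relax_variance(var0, var_eq, t, tau=1.0):
    """Variance of an overdamped particle in a harmonic trap after a
    temperature quench: exponential relaxation toward var_eq."""
    return var_eq + (var0 - var_eq) * math.exp(-t / tau)

# In trap units, the equilibrium variance is proportional to temperature.
T_cold, T_hot = 1.0, 4.0
t = 0.7
d_heating = kl_to_equilibrium(relax_variance(T_cold, T_hot, t), T_hot)
d_cooling = kl_to_equilibrium(relax_variance(T_hot, T_cold, t), T_cold)
# At equal elapsed times, the heating process sits closer to its target
# equilibrium than the cooling process does.
```

Even though the variance relaxes with the same exponential rate in both directions, the information-geometric distance is asymmetric, which is the intuition behind heating being "faster" than cooling.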
Compressing the Chronology of a Temporal Network with Graph Commutators
Temporal networks are notoriously difficult to work with. One of the main issues in their analysis is choosing how to compress the system without altering the information content of the data.
In recent years we have approached this problem for multilayer networks (see a recent review on the topic), from both structural and functional perspectives:
In this paper, the authors adopt an interesting approach, also based on information theory and hierarchical clustering, to compress temporal networks while remaining faithful to epidemic dynamics:
Studies of dynamics on temporal networks often represent the network as a series of “snapshots,” static networks active for short durations of time. We argue that successive snapshots can be aggregated if doing so has little effect on the overlying dynamics. We propose a method to compress network chronologies by progressively combining pairs of snapshots whose matrix commutators have the smallest dynamical effect. We apply this method to epidemic modeling on real contact tracing data and find that it allows for significant compression while remaining faithful to the epidemic dynamics.
Learning low-rank latent mesoscale structures in networks
Researchers in many fields use networks to represent interactions between entities in complex systems. To study the large-scale behavior of complex systems, it is useful to examine mesoscale structures in networks as building blocks that influence such behavior. In this paper, we present an approach to describe low-rank mesoscale structures in networks. We find that many real-world networks possess a small set of latent motifs that effectively approximate most subgraphs at a fixed mesoscale. Such low-rank mesoscale structures allow one to reconstruct networks by approximating subgraphs of a network using combinations of latent motifs. Employing subgraph sampling and nonnegative matrix factorization enables the discovery of these latent motifs. The ability to encode and reconstruct networks using a small set of latent motifs has many applications in network analysis, including network comparison, network denoising, and edge inference.
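A minimal sketch of the pipeline, assuming uniform random node sampling and a bare-bones multiplicative-update NMF (the paper's subgraph sampling and factorization are more sophisticated; all names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_subgraphs(adj, k, n_samples):
    """Vectorize the adjacency matrices of random k-node induced
    subgraphs into the columns of a data matrix."""
    n = adj.shape[0]
    cols = []
    for _ in range(n_samples):
        nodes = rng.choice(n, size=k, replace=False)
        cols.append(adj[np.ix_(nodes, nodes)].ravel())
    return np.array(cols, dtype=float).T      # shape (k*k, n_samples)

def nmf(X, r, n_iter=200, eps=1e-9):
    """Multiplicative-update NMF: X ~ W @ H with W, H >= 0.
    Columns of W, reshaped to k x k, are the latent motifs."""
    W = rng.random((X.shape[0], r))
    H = rng.random((r, X.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Reconstruction then amounts to approximating each sampled subgraph by a nonnegative combination of the latent motifs (the columns of `W`).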
Number of Attractors in the Critical Kauffman Model Is Exponential
The Kauffman model is the archetypal model of genetic computation. It highlights the importance of criticality, at which many biological systems seem poised. In a series of advances, researchers have homed in on how the number of attractors in the critical regime grows with network size. But a definitive answer has remained elusive. We prove that, for the critical Kauffman model with connectivity one, the number of attractors grows at least, and at most, as (2/√e)^N. This is the first proof that the number of attractors in a critical Kauffman model grows exponentially.
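For small N one can count attractors exhaustively. Here is a sketch assuming the usual convention that critical K = 1 networks use only the two non-constant Boolean functions (copy and negate); names and structure are mine:

```python
import random

def random_k1_network(n, rng):
    """A critical Kauffman network with connectivity K = 1: each node
    reads a single input node and either copies or negates it."""
    inputs = [rng.randrange(n) for _ in range(n)]
    negate = [rng.random() < 0.5 for _ in range(n)]
    def step(state):
        return tuple(1 - state[i] if neg else state[i]
                     for i, neg in zip(inputs, negate))
    return step

def count_attractors(step, n):
    """Follow every one of the 2**n states to its cycle and count
    the distinct cycles (attractors)."""
    attr_id = {}
    n_attr = 0
    for s0 in range(2 ** n):
        state = tuple((s0 >> b) & 1 for b in range(n))
        path, on_path = [], set()
        while state not in attr_id and state not in on_path:
            on_path.add(state)
            path.append(state)
            state = step(state)
        a = attr_id[state] if state in attr_id else n_attr
        if a == n_attr:
            n_attr += 1               # a new cycle was discovered
        for s in path:
            attr_id[s] = a
    return n_attr
```

Averaging the count over many random networks at increasing N is a quick way to eyeball the (2/√e)^N ≈ 1.21^N growth the paper proves.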
Ecosystems
Structured community transitions explain the switching capacity of microbial systems
Microbial systems appear to exhibit a relatively high switching capacity of moving back and forth among few dominant communities (taxon memberships). While this switching behavior has been mainly attributed to random environmental factors, the extent to which internal community dynamics affect the switching capacity of microbial systems remains unclear. Here, we integrate ecological theory and empirical data to demonstrate that structured community transitions increase the dependency of future communities on the current taxon membership, enhancing the switching capacity of microbial systems. Following a structuralist approach, we propose that each community is feasible within a unique domain in environmental parameter space. Then, structured transitions between any two communities can happen with probability proportional to the size of their feasibility domains and inversely proportional to their distance in environmental parameter space—which can be treated as a special case of the gravity model. We detect two broad classes of systems with structured transitions: one class where switching capacity is high across a wide range of community sizes and another class where switching capacity is high only inside a narrow size range. We corroborate our theory using temporal data of gut and oral microbiota (belonging to class 1) as well as vaginal and ocean microbiota (belonging to class 2). These results reveal that the topology of feasibility domains in environmental parameter space is a relevant property to understand the changing behavior of microbial systems. This knowledge can be potentially used to understand the relevant community size at which internal dynamics can be operating in microbial systems.
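The gravity-model analogy can be sketched as a transition matrix with probability proportional to the target community's feasibility-domain size and inversely proportional to the distance in environmental parameter space. The exact functional form, the exclusion of self-transitions, and all names below are my assumptions for illustration:

```python
import numpy as np

def transition_matrix(sizes, positions):
    """P[i, j] ~ (feasibility-domain size of community j) /
    (distance between communities i and j in parameter space),
    row-normalized into transition probabilities."""
    sizes = np.asarray(sizes, dtype=float)
    pos = np.asarray(positions, dtype=float)
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)       # no self-transitions in this sketch
    P = sizes[None, :] / d
    return P / P.sum(axis=1, keepdims=True)
```

A community with a large feasibility domain that sits close by in parameter space attracts most of the transition probability, which is what makes the current taxon membership predictive of the next one.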
Epidemics
Nonlinear bias toward complex contagion in uncertain transmission settings
As we gather evidence about the explanatory power of complex contagions, we must be careful and consider the subtle but important role that heterogeneity can play in shaping the rate of infection.
Current epidemics in the biological and social domains are challenging the standard assumptions of mathematical contagion models. Chief among them are the complex patterns of transmission caused by heterogeneous group sizes and infection risk varying by orders of magnitude in different settings, like indoor versus outdoor gatherings in the COVID-19 pandemic or different moderation practices in social media communities. However, quantifying these heterogeneous levels of risk is difficult, and most models typically ignore them. Here, we include these features in an epidemic model on weighted hypergraphs to capture group-specific transmission rates. We study analytically the consequences of ignoring the heterogeneous transmissibility and find an induced superlinear infection rate during the emergence of a new outbreak, even though the underlying mechanism is a simple, linear contagion. The dynamics produced at the individual and group levels are therefore more similar to complex, nonlinear contagions, thus blurring the line between simple and complex contagions in realistic settings. We support this claim by introducing a Bayesian inference framework to quantify the nonlinearity of contagion processes. We show that simple contagions on real weighted hypergraphs are systematically biased toward the superlinear regime if the heterogeneity of the weights is ignored, greatly increasing the risk of erroneous classification as complex contagions. Our results provide an important cautionary tale for the challenging task of inferring transmission mechanisms from incidence data. Yet, it also paves the way for effective models that capture complex features of epidemics through nonlinear infection rates.
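The induced superlinearity can be illustrated with a deliberately crude deterministic sketch: each group follows a simple linear SI dynamic, but pooling observations across groups with heterogeneous transmission rates and fitting rate ∝ i^ν yields an apparent exponent ν > 1, because high-rate groups dominate the large-i observations. The paper's actual setup (weighted hypergraphs, stochastic dynamics, Bayesian inference) is far richer; parameter values and names here are mine:

```python
import numpy as np

def si_trajectory(beta, n, steps, i0=1.0):
    """Deterministic within-group SI dynamics: di = beta * i * (n - i) / n."""
    traj = [i0]
    for _ in range(steps):
        i = traj[-1]
        traj.append(i + beta * i * (n - i) / n)
    return traj

def apparent_exponent(betas, n=100, steps=30):
    """Pool (infected, infection-rate) points over groups with the given
    transmission rates and fit rate ~ i**nu on a log-log scale."""
    xs, ys = [], []
    for beta in betas:
        for i in si_trajectory(beta, n, steps):
            if 1.0 <= i <= n / 2:                 # early growth phase only
                xs.append(np.log(i))
                ys.append(np.log(beta * i * (n - i) / n))
    nu, _ = np.polyfit(xs, ys, 1)
    return nu

nu_het = apparent_exponent([0.05, 0.5])   # heterogeneous group rates
nu_hom = apparent_exponent([0.2])         # homogeneous baseline
```

Ignoring the heterogeneity of `beta` and reading `nu_het` at face value would misclassify this perfectly linear contagion as a complex one.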
Neuroscience
Dendritic Computation
One of the central questions in neuroscience is how particular tasks, or computations, are implemented by neural networks to generate behavior. The prevailing view has been that information processing in neural networks results primarily from the properties of synapses and the connectivity of neurons within the network, with the intrinsic excitability of single neurons playing a lesser role. As a consequence, the contribution of single neurons to computation in the brain has long been underestimated. Here we review recent work showing that neuronal dendrites exhibit a range of linear and nonlinear mechanisms that allow them to implement elementary computations. We discuss why these dendritic properties may be essential for the computations performed by the neuron and the network and provide theoretical and experimental examples to support this view.
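A classic illustration of this point is a "two-layer" neuron whose sigmoidal dendritic subunits let it compute XOR, a function no single linear-threshold unit can implement. The specific weights below are mine, chosen only to make the example work:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def point_neuron(inputs, weights, theta):
    """Single linear-threshold unit: one weighted sum at the soma."""
    return sum(w * x for w, x in zip(weights, inputs)) > theta

def dendritic_neuron(inputs, branches, theta):
    """Two-layer model: each dendritic branch applies a sigmoidal
    nonlinearity to its own inputs before the somatic sum.
    branches: list of (list of (input_index, weight), branch_bias)."""
    drive = sum(sigmoid(sum(w * inputs[i] for i, w in branch) - bias)
                for branch, bias in branches)
    return drive > theta

# Two branches, each detecting one input while being vetoed by the other:
xor_branches = [([(0, 4.0), (1, -4.0)], 2.0),
                ([(0, -4.0), (1, 4.0)], 2.0)]
```

With a somatic threshold of 0.5, the dendritic neuron fires exactly when the two inputs disagree, an elementary computation a point neuron cannot perform.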
Neuronal ensembles: Building blocks of neural circuits
Neuronal ensembles, defined as groups of neurons displaying recurring patterns of coordinated activity, represent an intermediate functional level between individual neurons and brain areas. Novel methods to measure and optically manipulate the activity of neuronal populations have provided evidence of ensembles in the neocortex and hippocampus. Ensembles can be activated intrinsically or in response to sensory stimuli and play a causal role in perception and behavior. Here we review ensemble phenomenology, developmental origin, biophysical and synaptic mechanisms, and potential functional roles across different brain areas and species, including humans. As modular units of neural circuits, ensembles could provide a mechanistic underpinning of fundamental brain processes, including neural coding, motor planning, decision-making, learning, and adaptability.
Diffusion MRI Connections in the Octopus Brain
Using high angular resolution diffusion magnetic resonance imaging (HARDI) with fiber tractography analysis we map out a meso-scale connectome of the Octopus bimaculoides brain. The brain of this cephalopod has a qualitatively different organization than that of vertebrates, yet it exhibits complex behavior, an elaborate sensory system and high cognitive abilities. Over the last 60 years wide-ranging and detailed studies of octopus brain anatomy have been undertaken, including classical histological sectioning/staining, electron microscopy and neuronal tract tracing with injected dyes. These studies have elucidated many neuronal connections within and among anatomical structures. Diffusion MRI based tractography utilizes a qualitatively different method of tracing connections within the brain and offers facile three-dimensional images of anatomy and connections that can be quantitatively analyzed. Twenty-five separate lobes of the brain were segmented in the 3D MR images of each of three samples, including all five sub-structures in the vertical lobe. These parcellations were used to assay fiber tracings between lobes. The connectivity matrix constructed from diffusion MRI data was largely in agreement with that assembled from earlier studies. The one major difference was that connections between the vertical lobe and more basal supra-esophageal structures present in the literature were not found by MRI. In all, 92 connections between the 25 different lobes were noted by diffusion MRI: 53 between supra-esophageal lobes and 26 between the optic lobes and other structures. These represent the beginnings of a mesoscale connectome of the octopus brain.
Bio-inspired computing
Mosaic: in-memory computing and routing for small-world spike-based neuromorphic systems
The brain’s connectivity is locally dense and globally sparse, forming a small-world graph—a principle prevalent in the evolution of various species, suggesting a universal solution for efficient information routing. However, current artificial neural network circuit architectures do not fully embrace small-world neural network models. Here, we present the neuromorphic Mosaic: a non-von Neumann systolic architecture employing distributed memristors for in-memory computing and in-memory routing, efficiently implementing small-world graph topologies for Spiking Neural Networks (SNNs). We’ve designed, fabricated, and experimentally demonstrated the Mosaic’s building blocks, using integrated memristors with 130 nm CMOS technology. We show that, thanks to enforcing locality in the connectivity, the routing efficiency of Mosaic is at least one order of magnitude higher than that of other SNN hardware platforms, while Mosaic achieves competitive accuracy on a variety of edge benchmarks. Mosaic offers a scalable approach for edge systems based on distributed spike-based computing and in-memory routing.
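To illustrate the small-world principle the abstract appeals to (not the Mosaic hardware itself), here is a pure-Python Watts–Strogatz sketch: rewiring a small fraction of lattice edges collapses the average path length while clustering stays high. All parameters and names are mine:

```python
import random
from collections import deque

def watts_strogatz(n, k, p, rng):
    """Ring lattice of n nodes, each linked to its k nearest neighbours
    on one side, with every edge rewired with probability p."""
    edges = set()
    for u in range(n):
        for j in range(1, k + 1):
            v = (u + j) % n
            if rng.random() < p:
                v = rng.randrange(n)
                while v == u or (u, v) in edges or (v, u) in edges:
                    v = rng.randrange(n)
            edges.add((u, v))
    adj = {u: set() for u in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def clustering(adj):
    """Average local clustering coefficient."""
    total = 0.0
    for u, nbrs in adj.items():
        d = len(nbrs)
        if d < 2:
            continue
        links = sum(1 for v in nbrs for w in nbrs if v < w and w in adj[v])
        total += 2 * links / (d * (d - 1))
    return total / len(adj)

def avg_path_length(adj):
    """Mean BFS distance over reachable node pairs."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs
```

At a rewiring probability around 0.1 the graph keeps near-lattice clustering (local density) with near-random path lengths (global sparsity), which is the regime the Mosaic topology targets.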
Inferring neural activity before plasticity as a foundation for learning beyond backpropagation
For both humans and machines, the essence of learning is to pinpoint which components in their information-processing pipeline are responsible for an error in the output, a challenge that is known as ‘credit assignment’. It has long been assumed that credit assignment is best solved by backpropagation, which is also the foundation of modern machine learning. Here, we set out a fundamentally different principle on credit assignment called ‘prospective configuration’. In prospective configuration, the network first infers the pattern of neural activity that should result from learning, and then the synaptic weights are modified to consolidate the change in neural activity. We demonstrate that this distinct mechanism, in contrast to backpropagation, (1) underlies learning in a well-established family of models of cortical circuits, (2) enables learning that is more efficient and effective in many contexts faced by biological organisms and (3) reproduces surprising patterns of neural activity and behavior observed in diverse human and rat learning experiments.
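A minimal sketch of the infer-then-consolidate idea on a linear chain x → h → y, assuming a quadratic energy as in predictive-coding models (the paper's networks are nonlinear; learning rates, dimensions, and names here are mine):

```python
import numpy as np

def prospective_step(W1, W2, x, target, n_relax=50, lr_h=0.05, lr_w=0.05):
    """One learning step on a chain x -> h -> y with energy
    E = ||h - W1 x||^2 + ||y - W2 h||^2 and the output clamped."""
    h = W1 @ x                        # forward pass seeds the hidden activity
    y = target                        # output clamped to the desired value
    for _ in range(n_relax):          # phase 1: infer prospective activities
        grad_h = (h - W1 @ x) - W2.T @ (y - W2 @ h)
        h -= lr_h * grad_h
    # phase 2: weights consolidate the inferred activity pattern
    W1 = W1 + lr_w * np.outer(h - W1 @ x, x)
    W2 = W2 + lr_w * np.outer(y - W2 @ h, h)
    return W1, W2

rng = np.random.default_rng(0)
W1 = 0.1 * rng.standard_normal((3, 2))
W2 = 0.1 * rng.standard_normal((2, 3))
x = np.array([1.0, -1.0])
target = np.array([0.5, -0.5])

err_before = np.linalg.norm(W2 @ (W1 @ x) - target)
for _ in range(200):
    W1, W2 = prospective_step(W1, W2, x, target)
err_after = np.linalg.norm(W2 @ (W1 @ x) - target)
```

The key contrast with backpropagation: the hidden activity is settled first, with the output clamped, and the weight change then chases that prospective activity rather than a backpropagated error signal.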