If you find Complexity Thoughts interesting, follow us! Click on the Like button, leave a comment, repost on Substack or share this post. It is the only feedback I can have for this free service. The frequency and quality of this newsletter rely on social interactions. Thank you!
→ Don’t miss the podcast version of this post: click on “Our Podcast” above!
Foundations of network science and complex systems
New Universality Class Describes Vicsek’s Flocking Phase in Physical Dimensions
The Vicsek model is one of the most famous models to explain the emergence of collective behavior in active matter. It illustrates how local alignment rules among self-propelled particles can lead to collective movement patterns, similar to phenomena observed in flocks of birds, schools of fish, bacterial colonies, and many others. Each particle in the model updates its position and direction based on interactions within a fixed neighborhood radius, exhibiting emergent order:
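To make the update rule concrete, here is a minimal 2D sketch of a Vicsek-style step (parameter values and function names are illustrative choices, not taken from any particular paper's code):

```python
import numpy as np

def vicsek_step(pos, theta, L=10.0, r=1.0, v=0.3, eta=0.2, rng=None):
    """One update of a 2D Vicsek-style model with periodic boundaries.

    Each particle aligns with the mean heading of all neighbors within
    radius r, then adds angular noise of amplitude eta and moves forward
    at constant speed v.
    """
    if rng is None:
        rng = np.random.default_rng()
    n = len(pos)
    # pairwise displacements with minimal-image (periodic) wrapping
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)
    neighbors = (d ** 2).sum(-1) <= r ** 2  # boolean matrix; includes self
    # circular average of neighbor headings
    sin_sum = (neighbors * np.sin(theta)[None, :]).sum(1)
    cos_sum = (neighbors * np.cos(theta)[None, :]).sum(1)
    theta_new = np.arctan2(sin_sum, cos_sum) + eta * rng.uniform(-np.pi, np.pi, n)
    pos_new = (pos + v * np.stack([np.cos(theta_new), np.sin(theta_new)], -1)) % L
    return pos_new, theta_new

def polar_order(theta):
    """Polar order parameter: 1 = perfect alignment, ~0 = disordered."""
    return np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
```

Sweeping the noise amplitude `eta` and tracking `polar_order` is the standard way to see the order-disorder transition the flocking phase refers to.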
A paper by Toner and Tu that appeared in the same year shed light on the universality class of this process. Both papers provide the right background for the following one.
The Vicsek simulation model of flocking together with its theoretical treatment by Toner and Tu in 1995 were two foundational cornerstones of active matter physics. However, despite the field’s tremendous progress, the actual universality class (UC) governing the scaling behavior of Vicsek’s “flocking” phase remains elusive. Here, we use nonperturbative, functional renormalization group methods to analyze, numerically and analytically, a simplified version of the Toner-Tu model, and uncover a novel UC with scaling exponents that agree remarkably well with the values obtained in a recent simulation study by Mahault et al. [Phys. Rev. Lett. 123, 218001 (2019)], in both two and three spatial dimensions. We therefore believe that there is strong evidence that the UC uncovered here describes Vicsek’s flocking phase.
Searching for long timescales without fine tuning
Animal behavior occurs on timescales much longer than the response times of individual neurons. In many cases, it is plausible that these long timescales emerge from the recurrent dynamics of electrical activity in networks of neurons. In linear models, timescales are set by the eigenvalues of a dynamical matrix whose elements measure the strengths of synaptic connections between neurons. It is not clear to what extent these matrix elements need to be tuned to generate long timescales; in some cases, one needs not just a single long timescale but a whole range. Starting from the simplest case of random symmetric connections, we combine maximum entropy and random matrix theory methods to construct ensembles of networks, exploring the constraints required for long timescales to become generic. We argue that a single long timescale can emerge generically from realistic constraints, but a full spectrum of slow modes requires more tuning. Langevin dynamics that generates patterns of synaptic connections drawn from these ensembles involves a combination of Hebbian learning and activity-dependent synaptic scaling.
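The link between eigenvalues and timescales in the linear setting can be sketched with a random symmetric coupling matrix (the network size and coupling scale below are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Linear rate dynamics dx/dt = -x + J x: each eigenmode of the symmetric
# coupling matrix J relaxes with timescale 1 / (1 - lambda), so long
# timescales require the top eigenvalue to approach 1.
rng = np.random.default_rng(1)
N, g = 500, 0.45  # network size and coupling scale (assumed values)

A = rng.normal(0.0, g / np.sqrt(N), (N, N))
J = (A + A.T) / np.sqrt(2)  # symmetric couplings; semicircle spectrum of radius ~2g

lam = np.linalg.eigvalsh(J)   # sorted ascending
tau = 1.0 / (1.0 - lam)       # relaxation timescales of the eigenmodes

print(f"top eigenvalue ~ {lam[-1]:.2f}, slowest timescale ~ {tau[-1]:.1f}")
```

As `2g` is pushed toward 1 the slowest mode's timescale diverges, but only the edge of the semicircle gets close to 1, which is why a single long timescale is much easier to obtain generically than a whole band of slow modes.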
Theory of Coupled Neuronal-Synaptic Dynamics
Let us imagine a network of neurons, where their activity (neuronal dynamics) and their connections (synaptic dynamics) constantly influence each other. Yes, this is a beautiful example of an adaptive network. The model is:
where the synaptic coupling dynamics is given by a sum of quenched (J) and fluctuating (A) terms:
and the dynamics of the fluctuations is Hebbian-like:
Usually, the neurons might settle into a predictable, stable state. However, when the synaptic connections are continuously changing through learning rules like Hebbian plasticity, they can create more chaotic and unpredictable patterns. Now, if we stop (i.e., freeze) the synaptic changes at a certain moment, what happens to the chaos?
Depending on the system, the authors find that three outcomes are possible:
in nonfreezable chaos, the neurons return to their calm state;
in semifreezable chaos, the chaos partially persists but changes into something less dynamic;
in freezable chaos, the chaos is perfectly preserved.
The last type is especially exciting because it suggests that neurons might encode not just stable memories, but dynamic, chaotic ones, opening up new ways to think about how brains might process and store complex information.
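A minimal numerical sketch of such coupled dynamics, assuming tanh rates and a Hebbian-with-decay rule (the function, parameter values, and exact form of the update are illustrative, not the paper's equations):

```python
import numpy as np

def simulate(N=100, g=2.0, beta=1.0, tau_A=10.0, dt=0.05, steps=2000, seed=0):
    """Euler simulation of rate neurons x coupled through J + A, where the
    quenched part J is random and the fluctuating part A follows Hebbian
    dynamics with decay. Sketch only: phi = tanh and all parameter values
    are assumptions, not the paper's exact model.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), (N, N))  # quenched random couplings
    A = np.zeros((N, N))                         # fluctuating Hebbian part
    x = rng.normal(0.0, 1.0, N)
    for _ in range(steps):
        r = np.tanh(x)                                       # firing rates phi(x)
        x += dt * (-x + (J + A) @ r)                         # neuronal dynamics
        A += (dt / tau_A) * (-A + beta * np.outer(r, r) / N)  # Hebbian with decay
    return x, A

x, A = simulate()
```

"Freezing" the plasticity corresponds to holding `A` fixed from some time onward and letting only `x` evolve; the paper's three outcomes describe what the neuronal dynamics then does.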
In neural circuits, synaptic strengths influence neuronal activity by shaping network dynamics, and neuronal activity influences synaptic strengths through activity-dependent plasticity. Motivated by this fact, we study a recurrent-network model in which neuronal units and synaptic couplings are interacting dynamic variables, with couplings subject to Hebbian modification with decay around quenched random strengths. Rather than assigning a specific role to the plasticity, we use dynamical mean-field theory and other techniques to systematically characterize the neuronal-synaptic dynamics, revealing a rich phase diagram. Adding Hebbian plasticity slows activity in already chaotic networks and can induce chaos in otherwise quiescent networks. Anti-Hebbian plasticity quickens activity and produces an oscillatory component. Analysis of the Jacobian shows that Hebbian and anti-Hebbian plasticity push locally unstable modes toward the real and imaginary axes, respectively, explaining these behaviors. Both random-matrix and Lyapunov analysis show that strong Hebbian plasticity segregates network timescales into two bands, with a slow, synapse-dominated band driving the dynamics, suggesting a flipped view of the network as synapses connected by neurons. For increasing strength, Hebbian plasticity initially raises the complexity of the dynamics, measured by the maximum Lyapunov exponent and attractor dimension, but then decreases these metrics, likely due to the proliferation of stable fixed points. We compute the marginally stable spectra of such fixed points as well as their number, showing exponential growth with network size. Finally, in chaotic states with strong Hebbian plasticity, a stable fixed point of neuronal dynamics is destabilized by synaptic dynamics, allowing any neuronal state to be stored as a stable fixed point by halting the plasticity. This phase of freezable chaos offers a new mechanism for working memory.
Ecosystems
Global ecosystems are rapidly approaching tipping points, where minute shifts can lead to drastic ecological changes. Theory predicts that evolution can shape a system’s tipping point behaviour, but direct experimental support is lacking. Here we investigate the power of evolutionary processes to alter these critical thresholds and protect an ecological community from collapse. To do this, we propagate a two-species microbial system composed of Escherichia coli and baker’s yeast, Saccharomyces cerevisiae, for over 4,000 generations, and map ecological stability before and after coevolution. Our results reveal that tipping points—and other geometric properties of ecological communities—can evolve to alter the range of conditions under which our microbial community can flourish. We develop a mathematical model to illustrate how evolutionary changes in parameters such as growth rate, carrying capacity and resistance to environmental change affect ecological resilience. Our study shows that adaptation of key species can shift an ecological community’s tipping point, potentially promoting ecological stability or accelerating collapse.
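The role of parameters like growth rate and carrying capacity in setting a tipping point can be illustrated with a textbook fold bifurcation (this is a generic sketch, not the model developed in the paper): logistic growth plus a saturating mortality term loses its high-density equilibrium once stress crosses a critical value.

```python
# Generic tipping-point sketch (not the paper's model): logistic growth
# with a saturating stress/mortality term,
#   dn/dt = r*n*(1 - n/K) - h*n/(1 + n).
# Past a critical h, the high-density equilibrium vanishes and the
# population collapses; evolving r or K shifts where that happens.

def equilibrium(r, K, h, steps=20000, dt=0.01):
    """Integrate from n = K and return the long-time population density."""
    n = K
    for _ in range(steps):
        n += dt * (r * n * (1 - n / K) - h * n / (1 + n))
        n = max(n, 0.0)  # densities cannot go negative
    return n

r, K = 1.0, 10.0           # assumed growth rate and carrying capacity
low = equilibrium(r, K, h=0.5)   # mild stress: population persists near K
high = equilibrium(r, K, h=4.0)  # strong stress: collapse past the tipping point
```

For these values the critical stress sits near h ≈ 3; increasing `r` or `K` (as adaptation might) raises that threshold, which is the intuition behind evolution shifting a community's tipping point.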
Human behavior
How large language models can reshape collective intelligence
Collective intelligence underpins the success of groups, organizations, markets and societies. Through distributed cognition and coordination, collectives can achieve outcomes that exceed the capabilities of individuals—even experts—resulting in improved accuracy and novel capabilities. Often, collective intelligence is supported by information technology, such as online prediction markets that elicit the ‘wisdom of crowds’, online forums that structure collective deliberation or digital platforms that crowdsource knowledge from the public. Large language models, however, are transforming how information is aggregated, accessed and transmitted online. Here we focus on the unique opportunities and challenges this transformation poses for collective intelligence. We bring together interdisciplinary perspectives from industry and academia to identify potential benefits, risks, policy-relevant considerations and open research questions, culminating in a call for a closer examination of how large language models affect humans’ ability to collectively tackle complex problems.
Global systems
Hot and cold Earth through time
Reconstructing ancient Earth’s temperature reveals a global climate regulation system:
What was Earth’s temperature tens to hundreds of millions of years ago? The planet has gone through different periods, some with extensive polar ice caps and others being completely ice-free. Estimating past global temperature is important for understanding the history of life on Earth, for predicting future climate, and more broadly, to inform the search for other habitable planets. However, there have been major disagreements about whether there has been an overall decline in Earth’s temperature over time. On page 1316 of this issue, Judd et al. (1) report a new reconstruction of Earth’s temperature over the last 485 million years by combining climate models with geological data. In contrast to some estimates, they conclude that global warm periods were maintained in similar temperature ranges. This corroborates recent predictions by Isson and Rauzi (2) from a large data compilation of different geological samples, establishing a wider agreement.
A 485-million-year history of Earth’s surface temperature
A long-term record of global mean surface temperature (GMST) provides critical insight into the dynamical limits of Earth’s climate and the complex feedbacks between temperature and the broader Earth system. Here, we present PhanDA, a reconstruction of GMST over the past 485 million years, generated by statistically integrating proxy data with climate model simulations. PhanDA exhibits a large range of GMST, spanning 11° to 36°C. Partitioning the reconstruction into climate states indicates that more time was spent in warmer rather than colder climates and reveals consistent latitudinal temperature gradients within each state. There is a strong correlation between atmospheric carbon dioxide (CO2) concentrations and GMST, identifying CO2 as the dominant control on variations in Phanerozoic global climate and suggesting an apparent Earth system sensitivity of ~8°C.
Oxygen isotope ensemble reveals Earth’s seawater, temperature, and carbon cycle history
Earth’s persistent habitability since the Archean remains poorly understood. Using an oxygen isotope ensemble approach—comprising shale, iron oxide, carbonate, silica, and phosphate records—we reconcile a multibillion-year history of seawater δ18O, temperature, and marine and terrestrial clay abundance. Our results reveal a rise in seawater δ18O and a temperate Proterozoic climate distinct from interpretations of a hot early Earth, indicating a strongly buffered climate system. Precambrian sediments are enriched in marine authigenic clay, with prominent reductions occurring in concert with Paleozoic and Cenozoic cooling, the expansion of siliceous life, and the radiation of land plants. These findings support the notion that shifts in the locus and extent of clay formation contributed to seawater 18O enrichment, clement early Earth conditions, major climate transitions, and climate stability through the reverse weathering feedback.