If you find Complexity Thoughts valuable, click on the Like button, leave a comment, repost on Substack, or share this post. It is the only feedback I get for this free service.
The frequency and quality of this newsletter rely on social interactions. Thank you!
DALL·E 3 representation of this issue’s content
In a nutshell
Let’s start again after a short break for the first birthday of Complexity Thoughts!
In this issue we have another eclectic mix of papers, starting with the 50th anniversary of the renormalization group and its crucial role in physics and complex systems science. Particularly intriguing is Yuhai Tu's piece on non-equilibrium systems, which takes the RG beyond its traditional equilibrium borders. Shifting to ecosystems, reactivity, not just stability, emerges as a key measure for complex ecological systems: the authors show how ecosystems respond immediately after disturbances, suggesting that reactivity might even surpass stability in predicting extinction risks. Another collective behavior, synchronization, is explored for systems with large parameter heterogeneity, offering insights into stability in realistic settings. Turning to urban systems, simple spatial scaling rules are shown to explain complex city dynamics through a model that links population, roads, and socioeconomic interactions, backed by empirical data from several cities. It is a great period for bio-inspired computing: one paper discusses the synchronization of mobile reservoir computing oscillators on networks, revealing patterns that resemble natural phenomena such as bifurcations and intermittent synchronization. Another paper explores the chaotic dynamics of recurrent neural networks through their Lyapunov spectra, showing how chaos in these networks can be quantified and controlled, with insights for both biological and artificial intelligence applications.
Complex systems and network science foundations
Nature Physics published a special Focus collection for the 50th anniversary of Kenneth Wilson's work on the renormalization group, one of the most important frameworks in physics and beyond. Applying the RG to complex systems is still a tough problem, although it has been tackled by several outstanding works in network science. I recommend reading one of our recent reviews on Network Geometry, where the RG is discussed through the lens of network science.
I especially appreciated the piece by Yuhai Tu: The renormalization group for non-equilibrium systems. Most RG work has been done for equilibrium systems; Tu covers what has been learned from studying non-equilibrium flocking with the RG.
Collective behavior
Reactivity of complex communities can be more important than stability
Reactivity as a new quantity for the analysis of complex ecological systems, beyond stability:
Understanding stability—whether a community will eventually return to its original state after a perturbation—is a major focus in the study of various complex systems, particularly complex ecosystems. Here, we challenge this focus, showing that short-term dynamics can be a better predictor of outcomes for complex ecosystems. Using random matrix theory, we study how complex ecosystems behave immediately after small perturbations. Our analyses show that many communities are expected to be ‘reactive’, whereby some perturbations will be amplified initially and generate a response that is directly opposite to that predicted by typical stability measures. In particular, we find reactivity is prevalent for complex communities of mixed interactions and for structured communities, which are both expected to be common in nature. Finally, we show that reactivity can be a better predictor of extinction risk than stability, particularly when communities face frequent perturbations, as is increasingly common. Our results suggest that, alongside stability, reactivity is a fundamental measure for assessing ecosystem health.
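To make the distinction concrete, here is a minimal sketch (not the authors' code; the matrix size, interaction strength, and self-regulation are illustrative) that contrasts the usual asymptotic stability measure with reactivity, defined à la Neubert & Caswell as the largest eigenvalue of the symmetric part of the community matrix:

```python
import numpy as np

rng = np.random.default_rng(42)

def random_community_matrix(S=100, sigma=0.1, d=1.0):
    """Random Jacobian of an S-species community: off-diagonal interactions
    drawn i.i.d. with standard deviation sigma, self-regulation -d on the diagonal."""
    A = rng.normal(0.0, sigma, size=(S, S))
    np.fill_diagonal(A, -d)
    return A

def stability(A):
    """Asymptotic stability measure: max real part of the eigenvalues (negative = stable)."""
    return np.max(np.real(np.linalg.eigvals(A)))

def reactivity(A):
    """Reactivity (Neubert & Caswell): largest eigenvalue of the symmetric part
    H = (A + A^T)/2. Positive reactivity means some perturbations are amplified
    initially even when the system is asymptotically stable."""
    H = (A + A.T) / 2.0
    return np.max(np.linalg.eigvalsh(H))

A = random_community_matrix()
print(f"stability  (max Re lambda):            {stability(A):+.3f}")
print(f"reactivity (max eig of symmetric part): {reactivity(A):+.3f}")
```

A community with negative stability but positive reactivity is exactly the "reactive" case discussed in the paper: it returns to equilibrium eventually, yet amplifies some perturbations in the short term.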
Synchronization in networked systems with large parameter heterogeneity
Systems that synchronize in nature are intrinsically different from one another, with possibly large differences from system to system. While a vast part of the literature has investigated the emergence of network synchronization for the case of small parametric mismatches, we consider the general case that parameter mismatches may be large. We present a unified stability analysis that predicts why the range of stability of the synchronous solution either increases or decreases with parameter heterogeneity for a given network. We introduce a parametric approach, based on the definition of a curvature contribution function, which allows us to estimate the effect of mismatches on the stability of the synchronous solution in terms of contributions of pairs of eigenvalues of the Laplacian. For cases in which synchronization occurs in a bounded interval of a parameter, we study the effects of parameter heterogeneity on both transitions (asynchronous to synchronous and synchronous to asynchronous).
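The paper's curvature contribution function goes beyond the standard homogeneous analysis, but its basic ingredient, the Laplacian eigenvalues that bound the coupling range where synchronization is stable, can be sketched with a toy master-stability-style check. The network, the stability interval bounds, and the variable names below are placeholders, not values from the paper:

```python
import networkx as nx
import numpy as np

# Build a network and compute the Laplacian eigenvalues that enter any
# master-stability-type analysis of synchronization.
G = nx.watts_strogatz_graph(n=50, k=6, p=0.1, seed=1)
L = nx.laplacian_matrix(G).toarray().astype(float)
eigvals = np.sort(np.linalg.eigvalsh(L))   # 0 = lambda_1 <= lambda_2 <= ... <= lambda_n

# Suppose the synchronous solution is stable when sigma * lambda_i falls in
# (alpha_low, alpha_high) for all i >= 2 (a bounded stability interval).
alpha_low, alpha_high = 0.5, 10.0          # placeholder bounds of the stability interval
lam2, lam_n = eigvals[1], eigvals[-1]

# Coupling range for which ALL nonzero Laplacian eigenvalues lie inside the interval:
sigma_min = alpha_low / lam2
sigma_max = alpha_high / lam_n
print(f"lambda_2 = {lam2:.3f}, lambda_n = {lam_n:.3f}")
if sigma_min < sigma_max:
    print(f"synchronization predicted stable for coupling sigma in ({sigma_min:.3f}, {sigma_max:.3f})")
else:
    print("no coupling stabilizes synchronization (eigenratio lambda_n/lambda_2 too large)")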
Urban systems
Simple spatial scaling rules behind complex cities
An interesting contribution (from 2017) to the science of cities and the emergence of observed scaling laws:
Although most of wealth and innovation have been the result of human interaction and cooperation, we are not yet able to quantitatively predict the spatial distributions of three main elements of cities: population, roads, and socioeconomic interactions. By a simple model mainly based on spatial attraction and matching growth mechanisms, we reveal that the spatial scaling rules of these three elements are in a consistent framework, which allows us to use any single observation to infer the others. All numerical and theoretical results are consistent with empirical data from ten representative cities. In addition, our model can also provide a general explanation of the origins of the universal super- and sub-linear aggregate scaling laws and accurately predict kilometre-level socioeconomic activity. Our work opens a new avenue for uncovering the evolution of cities in terms of the interplay among urban elements, and it has a broad range of applications.
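As a toy illustration of the super- and sub-linear aggregate scaling the paper explains, here is a quick sketch (on synthetic data, not the authors' dataset) of how a scaling exponent in Y ∝ N^β is typically estimated from city-level observations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "cities": population N and a socioeconomic output Y that scales
# superlinearly, Y ~ N^beta with beta ~ 1.15 (a typical empirical value), plus noise.
N = rng.uniform(1e4, 1e7, size=200)
beta_true = 1.15
Y = 10.0 * N**beta_true * rng.lognormal(mean=0.0, sigma=0.2, size=N.size)

# Estimate the scaling exponent by ordinary least squares in log-log space:
# log Y = log Y0 + beta * log N
beta_hat, logY0_hat = np.polyfit(np.log(N), np.log(Y), deg=1)
print(f"estimated scaling exponent beta = {beta_hat:.3f} (true {beta_true})")
```

beta > 1 corresponds to the superlinear regime (socioeconomic outputs), beta < 1 to the sublinear one (infrastructure such as road length); the paper's model ties these exponents together so that observing one element constrains the others.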
Bio-inspired computing (and collective behavior)
Synchronization of multiple mobile reservoir computing oscillators in complex networks
What can happen if we consider agent-like behavior where each agent is… a reservoir computing machine?
We investigate collective behavior of multiply moving reservoir computing oscillators. Each moving oscillator is fitted to a same dynamical system and then wanders a network navigating by random walk strategy. Interestingly, we find that the moving oscillators gradually present a coherent rhythmic behavior when their number is large enough. In this situation, temporal and spatial characteristics of each moving oscillator are in excellent agreement with that of their learned dynamical system under consideration. Remarkably, we show that these reservoir computing oscillators can exhibit significantly distinct collective behaviors that resemble bifurcation phenomenon when changing a critical reservoir parameter. Furthermore, we find that when studying a continuous chaotic system, intermittent synchronization emerges among these reservoir computing oscillators. But there is no evidence of enhanced correlation in the associated laminar length sequence. Our work provides a feasible framework for studying synchronization phenomena of mobile agents in nature when merely observed information is available.
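For readers unfamiliar with the building block, here is a minimal sketch of a single "reservoir computing oscillator": an echo state network trained on a simple signal and then run in closed loop so that it sustains the learned dynamics on its own. The paper's multi-agent setting, with many such oscillators moving on a network by random walk, is not reproduced here, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# A small echo state network trained to reproduce a sine wave, then run
# autonomously so it behaves as a self-sustained oscillator.
N_res = 200                       # reservoir size
rho = 0.9                         # target spectral radius
W = rng.normal(0, 1, (N_res, N_res))
W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius rho
W_in = rng.uniform(-0.5, 0.5, N_res)

# Training signal: one-step-ahead prediction of sin(t)
T = 2000
u = np.sin(np.arange(T) * 0.1)

x = np.zeros(N_res)
states = np.zeros((T - 1, N_res))
for k in range(T - 1):
    x = np.tanh(W @ x + W_in * u[k])
    states[k] = x

washout = 200
X = states[washout:]
Y = u[washout + 1:T]              # targets: next value of the signal

# Ridge regression for the linear readout
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N_res), X.T @ Y)

# Closed-loop (autonomous) run: feed the prediction back as input
y = x @ W_out
preds = [y]
for _ in range(299):
    x = np.tanh(W @ x + W_in * y)
    y = x @ W_out
    preds.append(y)

print("first autonomous outputs:", np.round(preds[:5], 3))
```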
(Physics of) Neuroscience
Lyapunov spectra of chaotic recurrent neural networks
Thematically, this article is also about bio-inspired computing, but it appears within the Physics of Neuroscience collection in Phys. Rev. Research. The analytical methodology is very interesting and the results shed light on why RNNs work:
Recurrent networks are widely used as models of biological neural circuits and in artificial intelligence applications. Mean-field theory has been used to uncover key properties of recurrent network models such as the onset of chaos and their largest Lyapunov exponents, but quantities such as attractor dimension and Kolmogorov-Sinai entropy have thus far remained elusive. We calculate the complete Lyapunov spectrum of recurrent neural networks and show that chaos in these networks is extensive with a size-invariant Lyapunov spectrum and attractor dimensions much smaller than the number of phase space dimensions. The attractor dimension and entropy rate increase with coupling strength near the onset of chaos but decrease far from the onset, reflecting a reduction in the number of unstable directions. We analytically approximate the full Lyapunov spectrum using random matrix theory near the onset of chaos for strong coupling and discrete-time dynamics. We show that a generalized time-reversal symmetry of the network dynamics induces a point symmetry of the Lyapunov spectrum reminiscent of the symplectic structure of chaotic Hamiltonian systems. Temporally fluctuating input can drastically reduce both the entropy rate and the attractor dimension. We lay out a comprehensive set of controls for the accuracy and convergence of Lyapunov exponents. For trained recurrent networks, we find that Lyapunov spectrum analysis quantifies error propagation and stability achieved by different learning algorithms. Our methods apply to systems of arbitrary connectivity and highlight the potential of Lyapunov spectrum analysis as a diagnostic for machine learning applications of recurrent networks.
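The paper's results are largely analytical (mean-field and random matrix theory), but the object they study, the full Lyapunov spectrum, can be computed numerically with the standard QR (Benettin-style) reorthonormalization algorithm. A minimal sketch for a random discrete-time RNN, with illustrative parameters rather than those of the paper, looks like this:

```python
import numpy as np

rng = np.random.default_rng(7)

# Discrete-time random RNN: x_{t+1} = tanh(J x_t), with J_ij ~ N(0, g^2 / N).
# For coupling gain g > 1 the dynamics is chaotic; the full Lyapunov spectrum
# is obtained by repeatedly re-orthonormalizing a set of tangent vectors.
N, g = 100, 2.0
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))

T_transient, T_steps = 500, 3000
x = rng.normal(0, 1, N)
for _ in range(T_transient):                  # discard the transient
    x = np.tanh(J @ x)

Q = np.eye(N)
log_stretch = np.zeros(N)
for _ in range(T_steps):
    x = np.tanh(J @ x)
    D = (1.0 - x**2)[:, None] * J             # Jacobian of the map at this step
    Q, R = np.linalg.qr(D @ Q)                # re-orthonormalize the tangent vectors
    log_stretch += np.log(np.abs(np.diag(R)))

lyap = np.sort(log_stretch / T_steps)[::-1]   # Lyapunov exponents per step, sorted
print("largest exponents:", np.round(lyap[:5], 3))
print("number of positive exponents:", int(np.sum(lyap > 0)))
```

The number of positive exponents growing proportionally with N is the extensivity of chaos the authors report; from the sorted spectrum one can also read off attractor dimension (Kaplan-Yorke) and entropy rate (sum of positive exponents).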
Oldies but goodies
Selection by Consequences (1981)
Selection by consequences is a causal mode found only in living things, or in machines made by living things. It was first recognized in natural selection, but it also accounts for the shaping and maintenance of the behavior of the individual and the evolution of cultures. In all three of these fields, it replaces explanations based on the causal modes of classical mechanics. The replacement is strongly resisted. Natural selection has now made its case, but similar delays in recognizing the role of selection in the other fields could deprive us of valuable help in solving the problems which confront us.
Book of the month
It’s been a while since my last book recommendation. This one from Chris is definitely on my to-read list, since it promises to explain why information is the unifying principle that allows us to understand the evolution of complexity in nature. It will appear in early 2024, so stay tuned.