Examining the Dynamics of Biologically Inspired Systems Far From Equilibrium
Non-equilibrium systems admit no single method of analysis, and a wide array of dynamics can arise in them. In this work we present three very different non-equilibrium models, inspired by biological systems and phenomena, which we analyze computationally to showcase both the range of dynamics these systems encompass and the variety of techniques used to analyze them. The first system we model is a surface plasmon resonance (SPR) cell, a device used to determine the binding rates between chemical species. We simulate the SPR cell, compare the computational results with a mean-field approximation, and find that this simplification fails for a wide range of experimentally observed reaction rates. Specifically, the mean-field approximation places limits on the achievable resolution of the measured rates, and such an analysis fails to capture very fast binding dynamics.

The second system we analyze is an avalanching neural network that models the cascading neural activity observed in monkeys, rats, and humans. We use a model devised by Lombardi, Herrmann, de Arcangelis et al. to simulate this system and characterize its behavior as the fraction of inhibitory neurons is varied. At low inhibitory fractions we observe epileptic-like behavior, along with extended tails in the avalanche strength and duration distributions, which dominate the system in this regime. We also examine how the connectivity of these networks evolves at different inhibitory fractions, finding that high fractions of inhibitory neurons cause networks to evolve toward sparser connectivity, while networks with low fractions maintain their initial connectivity. Finally, we demonstrate two strategies for controlling the extreme avalanches present at low inhibitory fractions: the random or targeted disabling of neurons.
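To illustrate the kind of mean-field picture the SPR simulations are compared against, the following is a minimal sketch of well-mixed Langmuir binding kinetics, db/dt = k_on c (1 - b) - k_off b, where b is the bound fraction of surface sites. All rate constants, the analyte concentration, and the time step below are hypothetical placeholders chosen for illustration, not values from this work:

```python
# Mean-field (well-mixed) Langmuir model of SPR binding:
#   db/dt = k_on * c * (1 - b) - k_off * b
# b = fraction of occupied binding sites on the sensor surface.
# Hypothetical parameters, for illustration only.
k_on = 1.0e4    # association rate, 1/(M s)
k_off = 1.0e-2  # dissociation rate, 1/s
c = 1.0e-6      # analyte concentration, M

# Simple forward-Euler integration of the rate equation.
dt, steps = 0.01, 200_000  # 2000 s total, many relaxation times
b = 0.0
for _ in range(steps):
    b += dt * (k_on * c * (1.0 - b) - k_off * b)

# The mean-field equilibrium has a closed form:
b_eq = k_on * c / (k_on * c + k_off)
print(round(b, 4), round(b_eq, 4))  # → 0.5 0.5
```

A mean-field treatment of this kind assumes the analyte concentration at the surface is uniform; it is precisely this assumption that breaks down when binding is fast compared to transport in the cell.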
The final system we present is a sparsely encoding convolutional neural network, a computational system inspired by the human visual cortex that reconstructs input images using a set of "patterns," learned from previous images, as basis elements. The network attempts to do so "sparsely," so that the fewest neurons are used. Such systems are often applied to denoising tasks, in which noisy or fragmented images are reconstructed. We observe a minimum in the denoising error as the fraction of active neurons is varied, and find that the depth and location of this minimum obey finite-size scaling laws, suggesting the system undergoes a second-order phase transition. These finite-size scaling relations allow the system to be optimized further by tuning it to the critical point for any given system size.
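The finite-size scaling analysis mentioned above can be sketched as follows: if the location of the error minimum approaches a critical value as a power law in system size, a_min(N) - a_c ~ N^(-1/ν), then a log-log fit recovers the exponent. The data, the critical point a_c, and the exponent ν below are synthetic placeholders constructed to obey the scaling form exactly, not results from this work:

```python
# Hypothetical finite-size scaling check: if the minimum's location
# a_min(N) approaches a critical value a_c as a_min(N) - a_c ~ N**(-1/nu),
# then log(a_min - a_c) versus log(N) is a straight line of slope -1/nu.
import numpy as np

# Synthetic data built to satisfy the scaling form (a_c and nu are
# illustrative placeholders, not measured values).
a_c, nu = 0.1, 2.0
N = np.array([64, 128, 256, 512, 1024], dtype=float)  # system sizes
a_min = a_c + N ** (-1.0 / nu)                        # minimum locations

# Recover the exponent via linear regression in log-log coordinates.
slope, intercept = np.polyfit(np.log(N), np.log(a_min - a_c), 1)
print(round(-1.0 / slope, 3))  # → 2.0
```

In practice a_c is not known in advance, so it is treated as a fit parameter chosen to make the log-log plot most nearly linear; the same procedure applies to the depth of the minimum.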