Scaling Properties for ANN Models of C. elegans

Yang Tan Collective at MIT

Abstract: The nematode worm C. elegans allows straightforward optical measurement of neural activity, making it a unique platform for exploring intrinsic neural dynamics. This paper investigates the scaling properties essential for self-supervised prediction of neural activity from past neural data alone, without reference to behavior. Specifically, we ask how predictive accuracy, quantified by the mean squared error (MSE), scales with the amount of training data, considering the number of neurons recorded, the duration of recordings, and the diversity of datasets. We also examine how these scaling properties depend on the artificial neural network (ANN) model itself, including its size, architecture, and hyperparameters. Using the C. elegans nervous system as a testbed, we show that data volume and model complexity both critically shape self-supervised neural prediction: MSE decreases logarithmically with the amount of training data, a trend that holds consistently across diverse datasets, while varying the size of the ANN model produces nonlinear changes in MSE. These findings underscore the need for improved high-throughput tools for extended imaging of entire mesoscale nervous systems, so that enough data can be acquired to develop highly accurate ANN models of neural dynamics, with significant implications for systems neuroscience and biologically inspired AI.
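
As a minimal illustration of the self-supervised prediction task described in the abstract, the sketch below trains a small model to predict the next time step of neural activity from a window of past activity and reports the MSE. This is not the paper's implementation: the synthetic data, window length, GRU architecture, and optimizer settings are all illustrative assumptions.

```python
# Sketch of self-supervised next-step prediction with an MSE objective (illustrative only;
# the synthetic data, window length, and model architecture are assumptions, not the paper's).
import torch
import torch.nn as nn

torch.manual_seed(0)

num_neurons = 50   # number of recorded neurons (assumed)
seq_len = 1000     # time points in the recording (assumed)
window = 20        # past time steps used to predict the next step (assumed)

# Stand-in for a calcium-activity trace of shape (time, neurons); real data would replace this.
activity = torch.randn(seq_len, num_neurons)

# Build (past window -> next time step) training pairs.
X = torch.stack([activity[t:t + window] for t in range(seq_len - window)])  # (N, window, neurons)
y = activity[window:]                                                       # (N, neurons)

class NextStepPredictor(nn.Module):
    """Small recurrent model mapping a window of past activity to the next time step."""
    def __init__(self, n_neurons, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_neurons, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, n_neurons)

    def forward(self, x):
        _, h = self.rnn(x)          # h: (num_layers, batch, hidden)
        return self.readout(h[-1])  # predicted activity at the next time step

model = NextStepPredictor(num_neurons)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):  # a few epochs, just to show the training loop
    optimizer.zero_grad()
    pred = model(X)
    loss = loss_fn(pred, y)  # MSE between predicted and observed next-step activity
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: MSE = {loss.item():.4f}")
```

In a scaling study of the kind summarized above, one would repeat such training runs while varying the amount of training data and the model size, and then examine how the held-out MSE changes with each factor.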