Brain Signal Quantification and Functional Unit Analysis in Fluorescent Imaging Data by Unsupervised Learning

Virginia Tech


Optical recording of brain signals is becoming an indispensable technique in biological studies, accelerated by the development of new and improved biosensors and microscopy technologies. A major challenge in leveraging the technique is identifying and quantifying the rich patterns embedded in the data. Existing methods often struggle, however, either because of their limited signal-analysis capabilities or their poor performance. Here we present Activity Quantification and Analysis (AQuA2), an innovative analysis platform built on machine learning theory. AQuA2 features a novel event detection pipeline for precise quantification of intricate brain signals and incorporates a Consensus Functional Unit (CFU) module to explore interactions among the potential functional units driving repetitive signals. To enhance efficiency, we developed the BIdirectional pushing with Linear Component Operations (BILCO) algorithm to handle propagation analysis, a step that is time-consuming with traditional algorithms. For user-friendliness, AQuA2 is implemented both as a MATLAB package and as a Fiji plugin, each with a graphical interface. Validation on both simulated and real-world data demonstrates that AQuA2 outperforms peer tools. Applied across sensors (calcium, NE, and ATP), cell types (astrocytes, oligodendrocytes, and neurons), animal models (zebrafish and mouse), and imaging modalities (two-photon, light-sheet, and confocal), AQuA2 consistently delivers promising results and novel insights, demonstrating its versatility for fluorescent imaging data analysis.
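To make the propagation-analysis cost concrete: aligning fluorescence time courses from neighboring pixels is traditionally done with dynamic time warping (DTW), whose cost grows quadratically with the length of the traces — the kind of expense BILCO is designed to reduce. The sketch below is only an illustration of that classic baseline, not the AQuA2/BILCO implementation; the signal shapes and function name are invented for the example.

```python
import numpy as np

def dtw_cost(x, y):
    """Classic O(len(x) * len(y)) dynamic time warping distance.

    Illustrative baseline for propagation analysis: a delayed copy of a
    waveform (a signal "propagating" to a neighboring pixel) aligns with
    near-zero cost, whereas a point-by-point distance stays large.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# A Gaussian pulse and a delayed copy, mimicking signal propagation.
t = np.linspace(0, 1, 50)
pulse = np.exp(-((t - 0.3) ** 2) / 0.005)
delayed = np.exp(-((t - 0.5) ** 2) / 0.005)

print(dtw_cost(pulse, delayed))          # small: warping absorbs the delay
print(np.abs(pulse - delayed).sum())     # large: point-by-point mismatch
```

The nested loops make each pairwise alignment quadratic in trace length, and propagation analysis needs one alignment per neighboring-pixel pair, which is why this step dominates runtime in traditional pipelines.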



AQuA2, brain activity, interaction analysis, functional units, time-lapse imaging, machine learning