Spatiotemporal adaptive microscope control, driven by biological events
A key tool for studying the dynamics of living systems is the light microscope. Microscopes allow real-time recording of spontaneous or evoked spatiotemporal dynamics, data that can be used to develop models of how complex systems function. Today, cutting-edge microscopes can image below the diffraction limit of light (super-resolution microscopy), or over days, gently enough to allow an organism to develop and walk away (light-sheet microscopy). Yet microscopy studies of biological systems still largely rely on human control or pre-defined acquisition parameters to identify features of interest, perturb the system, and collect data at a given location and timescale. This is because subtle changes in protein dynamics and assembly patterns often herald events of interest – changes too subtle and unreliable to act as inputs to existing microscope automation.
Advances in intelligent systems and adaptive control have the potential to revolutionize how microscopy data are collected, and thereby to enable breakthroughs in our understanding of biological systems. We propose to develop a neural network-based microscope controller capable of detecting image signatures related to biological activity and, in response, adapting illumination patterns at multiple locations across an imaging field of view. The proposed project builds upon a neural-network microscope control framework previously developed in the Manley group, extending it to spatially and temporally adaptive control. We will apply this framework to push beyond the state of the art, using organismal studies in the Oates group and biofilm studies in the Manley group as proofs of concept.
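The event-driven control loop described above can be sketched in simplified form: a detector flags a biologically interesting change in each region of the field of view, and the controller raises illumination only where an event fires. This is a minimal illustrative sketch, not the Manley group's actual framework; the names (`AdaptiveController`, `update`), the threshold-based detector standing in for a neural network, and the per-region power levels are all assumptions for illustration.

```python
# Hypothetical sketch of spatially adaptive, event-driven illumination control.
# A real system would replace the threshold test with a trained neural network
# operating on image patches; values here are illustrative only.
from dataclasses import dataclass, field

@dataclass
class AdaptiveController:
    baseline: float    # expected mean intensity of a region with no activity
    threshold: float   # fractional change that counts as a biological event
    # per-region illumination power, as a fraction of maximum laser power
    illumination: dict = field(default_factory=dict)

    def update(self, region: str, frame_mean: float) -> bool:
        """Flag an event in `region` and adapt its illumination accordingly."""
        event = abs(frame_mean - self.baseline) / self.baseline > self.threshold
        # Boost power only where an event fires; idle regions stay gentle,
        # limiting photodamage across the rest of the field of view.
        self.illumination[region] = 1.0 if event else 0.1
        return event

ctrl = AdaptiveController(baseline=100.0, threshold=0.2)
print(ctrl.update("roi_1", 150.0))  # 50% change -> event, power boosted
print(ctrl.update("roi_2", 105.0))  # 5% change -> too subtle, stays gentle
```

The design choice to keep non-event regions at a low default power reflects the proposal's emphasis on gentle long-term imaging: high illumination is spent only where and when the biology demands it.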