Mammalian cells continuously sense and respond to chemical signals that vary in space and time. To operate in a changing environment, they must carry out complex computational tasks in real time. The dynamic processing of chemical signals by receptors on the cell surface resembles the sensory computations performed by neural microcircuits in the cerebral cortex.
We focus on developing a generic theory of computation and learning at the level of biochemical networks in single cells, by determining the underlying dynamical principles through which such information-processing capabilities emerge. We investigate how single cells employ working memory to integrate multiple time-varying signals and thereby maintain a stable identity, while simultaneously retaining plasticity in their responses. Formalizing these principles as computations with metastable states, we also explore whether single cells can learn. Using quantitative imaging, we experimentally test the proposed conceptual basis for how cells process non-stationary signals and learn to generalize their responses to a changing environment.
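To make the idea of computation with metastable states concrete, the sketch below is a minimal, hypothetical illustration (not the group's actual model): a single bistable biochemical variable whose two metastable states store a one-bit working memory of a transient chemical pulse. Weak pulses are forgotten, whereas sufficiently strong pulses switch the system to the other state, which persists after the signal is gone.

```python
# Hypothetical illustration of working memory via metastable states:
# x is a bistable variable (e.g., the active fraction of a signaling protein)
# obeying dx/dt = x - x**3 + u(t), with stable states near x = -1 and x = +1.
import numpy as np

def simulate(pulse_amplitude, dt=0.01, t_end=40.0):
    """Integrate dx/dt = x - x**3 + u(t) with a brief input pulse."""
    t = np.arange(0.0, t_end, dt)
    u = np.where((t > 10.0) & (t < 12.0), pulse_amplitude, 0.0)  # transient stimulus
    x = np.empty_like(t)
    x[0] = -1.0  # start in the "low" metastable state
    for i in range(1, len(t)):
        dxdt = x[i - 1] - x[i - 1] ** 3 + u[i - 1]
        x[i] = x[i - 1] + dt * dxdt  # forward Euler step
    return t, x

# A weak pulse relaxes back to the low state; a strong pulse switches the
# system to the high state, which is retained long after the input ends.
for amp in (0.3, 1.5):
    _, x = simulate(amp)
    print(f"pulse amplitude {amp}: final state x = {x[-1]:+.2f}")
```

This toy system is only meant to show how a non-stationary input can be converted into a persistent internal state; the networks studied here would involve many coupled variables and richer repertoires of metastable states.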