Prof. Macke's Lab has moved.
We seek to understand how populations of neurons collectively process sensory input, perform computations and control behavior. To this end, we develop statistical models and machine learning algorithms for neural and behavioral data analysis, and collaborate with experimental laboratories performing measurements of neural activity and behavior.
At a technical level, we tackle this problem by developing statistical models and machine learning algorithms for neural data analysis. Modern experimental techniques allow unprecedented insights into the structure and function of neural circuits. These advances open the possibility of studying the statistical structure of neural activity in large populations of neurons, and of using these insights in clinical applications such as neural prosthetics. However, understanding the complex data generated by neurophysiological experiments is a challenging task that requires powerful statistical methods.
Recordings of neural population activity yield high-dimensional time series with rich and dynamically changing statistical structure, and can be hard to visualize and interpret. Because of factors such as neural plasticity, as well as experimental challenges, these data can be non-stationary and exhibit correlations across temporal and spatial scales. Although traditional analyses based on single neurons and ‘off-the-shelf’ data-analysis techniques have yielded important insights into neural computation, they are ultimately limited, as they fail to fully capture the rich structure of neural population activity. To make real progress towards understanding computation in the brain, we therefore need powerful machine-learning methods that are adapted to the specific characteristics of neural data.
We develop, apply and analyze statistical methods for neural data analysis. Using Bayesian statistics as a framework, we build probabilistic models that combine prior knowledge about neural connectivity and response properties with the observed experimental data, leading to more realistic descriptions of neural population dynamics. In particular, we seek to build statistical methods for inferring internal states and functional networks from recordings of neural population activity. More generally, we aim to provide computational tools that have a thorough theoretical grounding, work robustly and efficiently on realistic data, and can readily be used in a wide range of scientific and clinical contexts.
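To make the idea of inferring internal states concrete, here is a minimal illustrative sketch (not code from our lab): a linear-Gaussian state-space model in which many neurons share a single latent state, recovered from noisy observations with a Kalman filter. All model parameters and dimensions below are arbitrary assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian state-space model of population activity:
#   latent state:  x_t = a * x_{t-1} + w_t,   w_t ~ N(0, q)
#   observations:  y_t = c * x_t + v_t,       v_t ~ N(0, r I), one entry per neuron
T, N = 200, 12            # time steps, neurons
a, q, r = 0.95, 0.1, 0.5  # dynamics, state noise, observation noise (assumed)
c = rng.normal(size=N)    # loading of each neuron onto the shared latent state

# Simulate a latent trajectory and the resulting population recording
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(scale=np.sqrt(q))
y = np.outer(x, c) + rng.normal(scale=np.sqrt(r), size=(T, N))

# Kalman filter: posterior mean (mu) and variance (P) of the latent state
mu, P = np.zeros(T), np.zeros(T)
mu_prev, P_prev = 0.0, q / (1 - a**2)  # stationary prior over x_0
for t in range(T):
    # predict one step ahead (the prior itself at t = 0)
    mu_pred = a * mu_prev if t > 0 else mu_prev
    P_pred = a**2 * P_prev + q if t > 0 else P_prev
    # update with the N-dimensional observation y[t]
    S = np.outer(c, c) * P_pred + r * np.eye(N)  # innovation covariance
    K = P_pred * np.linalg.solve(S, c)           # Kalman gain, shape (N,)
    mu[t] = mu_pred + K @ (y[t] - c * mu_pred)
    P[t] = P_pred * (1 - K @ c)
    mu_prev, P_prev = mu[t], P[t]

corr = np.corrcoef(x, mu)[0, 1]
print(f"correlation between true and inferred latent state: {corr:.2f}")
```

In practice the models we work with are richer (nonlinear dynamics, point-process observations, unknown parameters learned from data), but the same logic applies: posit a generative model of the population recording and compute a posterior over the unobserved internal variables.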
Understanding how neurons collectively represent sensory input, perform computations and guide behavior is one of the central goals of neuroscience. While the importance of studying populations of neurons has been recognized for decades, the experimental and theoretical tools to empirically investigate the computational properties of neural populations had been lacking until recently. It is now becoming increasingly clear that information processing in the brain is highly state-dependent. In other words, neither neural activity nor behavior is determined by the external stimulus alone; both can be strongly modulated by internal states and endogenously generated dynamics. Cognitive processes such as attention or behavioral states can lead to widespread modulations of neural excitability. Intrinsic neuronal properties such as synaptic plasticity can make neuronal responses dependent on the recent activity of the network. Thus, both behavior and neural activity are strongly influenced by dynamic changes in the internal states of cortical networks, and can be very different from one trial to the next. We seek to better understand how such internal states and processes influence both neural activity and behavior, and to characterize their implications for neural information processing.
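A small simulation illustrates how a fluctuating internal state can produce the trial-to-trial variability described above. In this toy example (an assumption for illustration, not a result from our work), a neuron's spike count to an identical stimulus is Poisson, but its rate is scaled by a latent "excitability" gain that varies across trials, inflating the Fano factor well beyond the value of 1 expected from a fixed-rate Poisson neuron.

```python
import numpy as np

rng = np.random.default_rng(1)

# Trial-to-trial spike counts of one hypothetical neuron to a repeated,
# identical stimulus. A latent multiplicative gain models the internal state.
n_trials, base_rate = 500, 10.0   # trials, mean spikes per trial at gain 1
gain = rng.gamma(shape=4.0, scale=0.25, size=n_trials)  # mean 1, variance 0.25
counts = rng.poisson(base_rate * gain)

# Fano factor = variance / mean of the spike counts across trials.
# A fixed-gain Poisson neuron would give a value near 1; the fluctuating
# internal state pushes it far above that.
fano = counts.var() / counts.mean()
print(f"Fano factor with a fluctuating internal state: {fano:.2f}")
```

This doubly stochastic structure is one reason why averaging across trials can be misleading: the "noise" is partly a signal about the network's internal state, which is exactly what latent-state models try to recover.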