
Richard Csaky

DPhil student

Modelling and decoding noninvasive brain signals with deep learning


I obtained my PhD under the supervision of Mark Woolrich, Oiwi Parker Jones, and Mats van Es at the Oxford Centre for Human Brain Activity. My research was at the intersection of machine learning and neuroscience: I analysed, modelled, and decoded EEG and MEG data using standard signal processing and machine learning techniques, and developed novel deep learning approaches for such data. I am especially interested in decoding language (e.g. reading and inner speech) and in potential applications to brain-computer interfaces (BCI). I am also increasingly interested in novel invasive and noninvasive BCI technologies.


I completed an M.S. in Computer Science and a B.S. in Mechatronics at the Budapest University of Technology. I was involved in dialogue-modelling research for three years under the supervision of Gabor Recski, and I also did computer vision research at Robert Bosch.

Research focus

My research spans several topics:

1. Improving supervised and unsupervised group-level models of M/EEG data by accounting for between-subject variability.

2. Analysing how deep learning can be leveraged to decode M/EEG data, primarily from visual tasks.

3. Developing deep (transfer) learning models for simultaneous forecasting and stimulus decoding from M/EEG data.

4. Collecting a large number of trials of both EEG and MEG data from a few participants performing reading and inner-speech tasks, and analysing these data with the decoding methods mentioned above.