
Alejo J Nevado-Holgado

MSc, MSc, PhD


Lead of Bioinformatics, Principal Investigator

I lead the AI team of the TNDR laboratory (https://www.psych.ox.ac.uk/research/dementia-research-group), a group of 22 excellent machine learning researchers and bioinformaticians. In addition, I am the Director of Artificial Intelligence at Cristal Health, a University spin-off applying AI to data from hospitals. In both roles, our focus is on the application of machine learning and bioinformatics to mental health care.

The main technologies we apply and develop in our laboratory are bioinformatics, artificial intelligence and high performance computing, and we apply them to data from biotech laboratories (i.e. genomics, transcriptomics, proteomics and metabolomics from human samples and iPSCs) and from hospitals or GP practices (i.e. Electronic Health Records and cohorts of volunteer patients). In the case of biotech data, bioinformatics methods allow us to identify the metabolic processes associated with neurological diseases, so that appropriate pharmaceutical targets and drugs can be developed. In the case of hospital and GP data, artificial intelligence and neural networks allow us to extract the diagnoses, medications, symptoms and medical test results of millions of patients from the free-text notes written by doctors; these can then be analysed to evaluate the most effective treatment for each patient (personalised medicine), or whether certain drugs are serendipitously ameliorating psychiatric conditions (drug repurposing). We often combine biotech and hospital/GP data to validate laboratory results with real-world evidence, so that any target or treatment we propose has a better chance of succeeding in the final stages of clinical trials. In all cases, high performance computing, both in software (e.g. concurrent programming, threading or CUDA) and in hardware (e.g. our 40-GPU computing cluster), is our tool of choice for performing these calculations in hours rather than years.
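
As a toy illustration of the text-mining step described above, the sketch below extracts diagnosis and medication mentions from a made-up clinical note. For the sake of a self-contained example it uses rule-based matching in spaCy; the systems described in this paragraph are trained neural models, and the patterns, labels and note below are invented.

# Toy sketch: pulling structured facts out of free-text clinical notes.
# A rule-based stand-in for a trained neural NER model; patterns are invented.
import spacy

nlp = spacy.blank("en")                      # empty English pipeline
ruler = nlp.add_pipe("entity_ruler")         # rule-based entity matcher
ruler.add_patterns([
    {"label": "DIAGNOSIS", "pattern": [{"LOWER": "vascular"}, {"LOWER": "dementia"}]},
    {"label": "MEDICATION", "pattern": [{"LOWER": "donepezil"}]},
])

note = "Impression: vascular dementia. Continue donepezil 5 mg daily."
for ent in nlp(note).ents:
    print(ent.text, "->", ent.label_)        # e.g. "vascular dementia -> DIAGNOSIS"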

In summary, we believe in the benefits that information technologies can bring (and are bringing) to health care and drug discovery, and we actively work both on implementing these methods in the lab and the clinic, and on developing the very computational methods that make this possible.

Trajectory

I did my PhD at the University of Bristol, in the Department of Computer Science, under the supervision of Dr Rafal Bogacz and Dr John Terry. During those very interesting years, I applied mathematical modelling, signal analysis and machine learning to the study of the basal ganglia and Parkinson's disease. With these techniques we investigated which anomalies were generating the patterns of neuronal activity recorded by experimental groups, such as our collaborator Dr Peter Magill and his team.

After some experimental training in Cambridge, I am now again applying machine learning and bioinformatics to the study of neurodegeneration, this time to investigate biomarkers and the metabolic networks of these diseases. Alzheimer's and Parkinson's disease are known to have a very long prodromal course: some underlying cause gradually destroys brain tissue, yet the condition is not diagnosed until up to 20 years later, when much of the affected brain tissue is already lost and unrecoverable. Any technique that aims to stop the advance of these diseases therefore first needs to detect them in the prodromal stage. Traditional analysis approaches have not yet been able to do so, although recent technological developments may change this.

For instance, a very large amount of medical and biological data has been produced during these decades of research, but this data is scattered across many institutes and hospitals, and its size makes it impossible to be analysed by a person in the classical way. We are aiming at first linking all these data together, and then thoroughly analysing it with machine learning and artificial intelligence approaches, which can make sense of data when it is beyond human interpretation due to size and complexity. We think this approach, which is proving of great success in high tech industry, has the best chances at detecting neurodegeneration in its prodromal stage, and helping us understand how to modify its course and avoid brain damage.