Professor in Particle Physics
What are your research interests?
I am a professor of particle physics at the University of Manchester. My research aims to find out more about the constituents of matter, and I am equally motivated by the challenges of the “big science” needed to study them. The Large Hadron Collider is the perfect scientific environment to combine the two: together with my team and collaborators, our goal is to discover new particles in a data-rich research environment. My work sits at the interface of high-energy physics and software/computing, and my specific research interests are real-time analysis using compute accelerators, data compression, software sustainability and carbon impact, and first steps into quantum machine learning.
What is the focus of your current research?
We are in the thick of the third data-taking period of the Large Hadron Collider. After collecting the first data with the new data-taking algorithms we put in place, we can now use that data to search for, and hopefully discover, new particles!
I will soon start working on a software excellence framework for science in Europe, and I’m looking forward to collaborating with the Software Sustainability Institute here at Manchester, who are also key players in the project.
What are some projects or breakthroughs you wish to highlight?
We are working on a machine-learning-based compression algorithm and on its carbon-footprint-based optimisation, in collaboration with colleagues at Lund, Ohio State and Tel Aviv University. This is an open-source project that uses datasets from multiple disciplines, and soon also from industry.
With the postdocs, students and interns working with us, we are also pursuing carbon-footprint-based optimisation, a growing concern when dealing with large models and/or large amounts of data, as a step towards meeting Net Zero targets.
What memberships and awards do you hold/have you held in the past?
I hold an ERC Consolidator Grant on real-time data taking and data analysis techniques at the Large Hadron Collider for new particle searches, and I coordinate the European Training Network SMARTHEP on machine learning and hybrid computing architectures for real-time analysis in particle physics, industry and society.
What is the biggest challenge in Data Science and AI right now?
FAIR (Findable, Accessible, Interoperable and Reusable), equitable and sustainable AI.
What real world challenges do you see Data Science and AI meeting in the next 25 years?
Considering the growth of data science and AI over the past 25 years, one could think the sky is the limit! I think (or hope!) that we will establish a solid interpretability basis for unsupervised learning, so we can use it to point to and pursue out-of-the-box solutions to the big questions of our universe.