Studying particle collisions – Solvd at ECAI 2025

ECAI (the European Conference on Artificial Intelligence) is a leading Artificial Intelligence (AI) event in Europe and is recognized as a major conference worldwide. It has been held every two years since 1992 in varying European cities, including Vienna (Austria), Kraków (Poland), and Valencia (Spain).

This year, Solvd’s team is joining ECAI with one of its newest research projects and a new solution – ExpertSim.

ExpertSim: fast particle detector simulation using a mixture of generative experts

Particle colliders like the LHC (Large Hadron Collider) or RHIC (Relativistic Heavy Ion Collider) are research devices in particle physics that accelerate atomic nuclei or electrons to near-light speed and collide them with other particles. These collisions can produce new particles, which are registered by the particle detector surrounding the collision point; its sensors gather data from each collision.

This research gives physicists insight into the fabric of the universe at the subatomic level.

Challenges with the current approach 

Particle accelerators use tremendous amounts of energy to run experiments. The European Organization for Nuclear Research (CERN, originally the Conseil Européen pour la Recherche Nucléaire) uses up to 1.3 TWh of electricity per year, the equivalent of the annual consumption of about 370,000 households in Spain.

Experiments at the accelerator are essentially tests of whether the mathematical predictions derived from a particular hypothesis fit the observations. Making those predictions requires running simulations beforehand. The simulation methods used today rely on statistical Monte Carlo techniques, which place a heavy burden on the research facility’s computing infrastructure: in 2023, over 540,000 CPU devices were engaged in computations for the experiments. This scale and power consumption make the simulations extremely costly.
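
To give a feel for why Monte Carlo simulation is so compute-hungry, here is a minimal, hypothetical sketch in Python. It is not the simulation code actually used at CERN (which relies on dedicated physics toolkits); it only illustrates the pattern: each event is simulated by sampling many random interactions, so the total cost grows linearly with the number of events and the detail of each one.

```python
import numpy as np

def simulate_event(rng, n_steps=1000):
    """Toy Monte Carlo event: propagate a particle through a detector volume,
    sampling a random energy deposit at every step."""
    energy = 100.0                       # initial energy, arbitrary units
    deposits = np.zeros(n_steps)
    for i in range(n_steps):
        # A random fraction of the remaining energy is deposited at this step.
        fraction = rng.uniform(0.0, 0.01)
        deposit = energy * fraction
        energy -= deposit
        deposits[i] = deposit
    return deposits

def run_simulation(n_events=10_000, seed=42):
    """Simulate many independent events -- the cost scales with n_events,
    which is why full Monte Carlo campaigns need huge CPU farms."""
    rng = np.random.default_rng(seed)
    responses = [simulate_event(rng) for _ in range(n_events)]
    return np.mean(responses, axis=0)    # average detector response profile

if __name__ == "__main__":
    profile = run_simulation()
    print(f"Mean total energy deposited per event: {profile.sum():.2f}")
```

A generative model trained on such simulated (or recorded) responses can produce new detector responses in a single forward pass, which is the efficiency gain the AI-based approach targets.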

To tackle this challenge, Solvd’s researchers decided to use generative AI. 

Solvd’s approach – ExpertSim 

The LHC and comparable devices produce extreme amounts of data to process. Yet this data does not follow the standard distributions seen, for example, in computer vision applications, so applying AI to these simulations requires a significantly different approach. In this particular case, the research team decided to use a Mixture of Experts.

What is a Mixture of Experts?

The Mixture of Experts approach builds a neural network out of “expert” modules plus a gating network that decides which expert should process a given input. The experts can be designed to handle different tasks within a single domain. A good (though heavily simplified) example is a neural network controlling cameras in an autonomous car, where separate experts recognize road signs, vehicles, and landmarks.

In the context of the LHC, the Mixture of Experts approach leverages the high diversity of inputs that the neural network can encounter. The proposed model selects a specialized expert for each input, enabling fast processing without compromising accuracy.
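
As an illustration only (the layers and dimensions below are made up and do not reflect ExpertSim’s actual generative architecture), a minimal mixture-of-experts layer in PyTorch could look like this: a gating network scores the experts for each input, and the expert outputs are combined according to those scores.

```python
import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    """Minimal mixture-of-experts layer: a gating network decides how much
    each expert contributes to the output for a given input."""

    def __init__(self, in_dim: int, out_dim: int, n_experts: int = 4):
        super().__init__()
        # Each expert is a small, independent sub-network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))
            for _ in range(n_experts)
        )
        # The gating network produces one weight per expert.
        self.gate = nn.Linear(in_dim, n_experts)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = torch.softmax(self.gate(x), dim=-1)                    # (batch, n_experts)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=1)   # (batch, n_experts, out_dim)
        # Weighted sum of expert outputs.
        return torch.einsum("be,beo->bo", weights, expert_outs)

if __name__ == "__main__":
    layer = MixtureOfExperts(in_dim=16, out_dim=8)
    y = layer(torch.randn(32, 16))
    print(y.shape)  # torch.Size([32, 8])
```

With a hard (argmax) gate instead of the soft weighting shown here, only the single selected expert runs for each input, which is where the computational savings of expert-based models come from.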

The outcome

ExpertSim outperforms existing approaches and achieves the highest simulation fidelity, improving on previous state-of-the-art solutions by 15%. The work demonstrates that this approach can bring substantial savings and accuracy gains to high-energy physics experiments.

The research was delivered by Patryk Będkowski, Jan Dubiński, Filip Szatkowski, Kamil Deja, Przemysław Rokita and Tomasz Trzciński, representing Warsaw University of Technology, NASK National Research Institute, IDEAS NCBR and IDEAS Research Institute. The research paper can be found on arXiv.