
GUIDE for incremental learning – Solvd at ECAI 2025


The European Conference on Artificial Intelligence (ECAI) brings together Artificial Intelligence (AI) researchers and practitioners to connect and exchange ideas on the directions in which AI may and should evolve. 

Solvd’s research team proposed a solution to catastrophic forgetting: a system that acts as a teacher and mentor for a retrained neural network. 

Guidance-based incremental learning with diffusion models  

The most popular approach to building machine learning models requires a dataset (the bigger, the better) and a training run that builds the skill into the neural network. Yet the world is not static, so the skills within the network need to be changed or updated from time to time.   

The common approach is to train a new network from scratch and replace the existing one. This comes with two main drawbacks:  

  • The new network may underperform compared to the previous one. Training is non-deterministic, so every run may produce a slightly different model.  
  • The process is expensive. Every training run requires power, time and human oversight.  

So far, the best response to these challenges is continual learning.  

What is continual learning?  

Continual learning aims to update the neural network rather than replace it. It can be compared to lifelong learning in humans – one can learn a new skill or polish an existing one through practice. With enough patience and dedication, one can acquire a plethora of skills, like driving a car, playing the guitar, performing brain surgery or woodworking.   

But while people tend to forget unused skills slowly, artificial neural networks face a far more severe version of this problem – catastrophic forgetting.   

What is catastrophic forgetting? 

Catastrophic forgetting occurs when a neural network retrained with continual learning techniques loses the ability to perform tasks it could handle before. Following the example above, it is as if a person forgot how to drive a car while learning to play the guitar. This drawback makes continual learning techniques hard to apply in practice. And that’s where Solvd’s team decided to challenge the status quo.  

Solvd’s approach – artificial teacher  

To tackle catastrophic forgetting, Solvd’s team introduces GUIDE, a novel continual learning approach that uses diffusion models to generate rehearsal samples corresponding to the skills most at risk of being forgotten during the continual learning process.   

The idea behind the system is based on repetition in the learning process – the foundation for gaining new skills and polishing existing ones. If the continual learning process is enriched with repetition of critical existing skills, catastrophic forgetting is mitigated.   
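The repetition idea above can be expressed as a replay-style training loop: every optimization step mixes fresh samples from the new task with synthetic samples of earlier tasks, so old skills keep being rehearsed. The snippet below is a minimal sketch of this pattern, not GUIDE's actual implementation – `generate_replay` is a hypothetical stand-in for the diffusion model that synthesizes old-task samples, and `model_step` is a placeholder for one optimization step.

```python
import random

def build_replay_batch(new_samples, generate_replay, batch_size=8, replay_ratio=0.5):
    """Mix current-task samples with rehearsal samples drawn from a
    generative model of previous tasks (the role a diffusion model
    plays in GUIDE). `generate_replay` is any callable returning one
    synthetic sample resembling earlier training data."""
    n_replay = int(batch_size * replay_ratio)   # rehearsal share of the batch
    n_new = batch_size - n_replay               # current-task share
    batch = [generate_replay() for _ in range(n_replay)]
    batch += random.sample(list(new_samples), n_new)
    random.shuffle(batch)                       # avoid ordering bias
    return batch

def continual_training_loop(model_step, new_samples, generate_replay, steps=10):
    """One illustrative pass of replay-based continual learning: each
    step trains on both new and rehearsed data, so existing skills are
    practiced while the new one is acquired."""
    for _ in range(steps):
        batch = build_replay_batch(new_samples, generate_replay)
        model_step(batch)  # placeholder for one gradient update
```

In a real system the replay generator would be trained on the previous tasks' data distribution; the point of the sketch is only the batch composition, which is what keeps the old skills in rotation.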

The outcome  

GUIDE outperforms other state-of-the-art generative replay methods, significantly reducing catastrophic forgetting in class-incremental learning.  

The research was delivered by Bartosz Cywiński, Kamil Deja, Tomasz Trzciński, Bartłomiej Twardowski and Łukasz Kuciński, representing Universitat Autònoma de Barcelona, the University of Warsaw and the Polish Academy of Sciences. The research paper can be found on arXiv.