We are pleased to announce that our article has been published in Information Sciences (IF: 8.1). The manuscript was submitted on January 27, 2024, and accepted on May 13, 2024.

📌 The PDF of the article is free to download: https://doi.org/10.1016/j.ins.2024.120751

Title: Few-shot class incremental learning via robust transformer approach

Abstract:
Few-Shot Class-Incremental Learning (FSCIL) presents an extension of the Class-Incremental Learning (CIL) problem in which a model faces data scarcity while also addressing the Catastrophic Forgetting (CF) problem. The problem remains open because recent works are built upon Convolutional Neural Networks (CNNs), which perform sub-optimally compared to transformer approaches. Our paper presents the Robust Transformer Approach (ROBUSTA), built upon the Compact Convolutional Transformer (CCT). The issue of overfitting due to few samples is overcome with the notion of a stochastic classifier, where the classifier's weights are sampled from a distribution with mean and variance vectors, thus increasing the likelihood of correct classifications, and with a batch-norm layer to stabilize the training process. The issue of CF is addressed with the idea of delta parameters, small task-specific trainable parameters, while keeping the backbone network frozen. A non-parametric approach is developed to infer the delta parameters for the model's predictions. A prototype rectification approach is applied to avoid biased prototype calculations caused by data scarcity. The advantage of ROBUSTA is demonstrated through a series of experiments on benchmark problems, where it outperforms prior art by large margins without any data augmentation protocols.
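For readers curious about the stochastic classifier idea mentioned above, here is a minimal, illustrative sketch in PyTorch, not the authors' implementation: classifier weights are drawn from a Gaussian whose mean and variance vectors are learned, with the reparameterization trick keeping the sampling step differentiable. The class name `StochasticClassifier`, the cosine-similarity logits, and all hyperparameters are our own assumptions for illustration; the paper's exact formulation is in the linked PDF.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticClassifier(nn.Module):
    """Illustrative sketch (hypothetical, not the paper's code): the
    classifier's weights are sampled from a Gaussian parameterized by
    learned mean and log-variance vectors."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        # Learned mean and log-variance of the classifier weights.
        self.weight_mu = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        self.weight_logvar = nn.Parameter(torch.full((num_classes, feat_dim), -4.0))

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Reparameterization trick: weight = mu + sigma * eps, so
            # gradients flow to the mean and variance parameters.
            std = torch.exp(0.5 * self.weight_logvar)
            weight = self.weight_mu + std * torch.randn_like(std)
        else:
            # Use the mean weights deterministically at evaluation time.
            weight = self.weight_mu
        # Cosine-style logits (a common choice in few-shot settings);
        # a plain linear layer would also work here.
        return F.linear(F.normalize(features, dim=-1),
                        F.normalize(weight, dim=-1))

# Usage: logits = StochasticClassifier(512, 100)(torch.randn(8, 512))
```

Sampling the weights rather than fixing them acts as a regularizer under data scarcity: with only a few samples per new class, a point-estimate classifier overfits easily, whereas a distribution over weights smooths the decision boundaries.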