FeTrIL++: Feature Translation for Exemplar-Free Class-Incremental Learning with Hill-Climbing
Exemplar-free class-incremental learning (EFCIL) poses significant challenges, primarily due to catastrophic forgetting, necessitating a delicate balance between stability and plasticity to accurately recognize both new and previous classes. Traditional EFCIL approaches typically skew towards either model plasticity through successive fine-tuning or stability by employing a fixed feature extractor beyond the initial incremental state. Building upon the foundational FeTrIL framework, our research extends into novel experimental domains to examine the efficacy of various oversampling techniques and dynamic optimization strategies across multiple challenging datasets and incremental settings. We specifically explore how oversampling impacts accuracy relative to feature availability and how different optimization methodologies, including dynamic recalibration and feature pool diversification, influence incremental learning outcomes. The results from these comprehensive experiments, conducted on CIFAR-100, Tiny-ImageNet, and an ImageNet-Subset, underscore the superior performance of FeTrIL in balancing accuracy for both new and past classes against ten contemporary methods. Notably, our extensions reveal the nuanced impacts of oversampling and optimization on EFCIL, contributing to a more refined understanding of feature-space manipulation for class-incremental learning. FeTrIL and its extended analysis in this paper, FeTrIL++, pave the way for more adaptable and efficient EFCIL methodologies, promising significant improvements in handling catastrophic forgetting without the need for exemplars.
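To make the feature-space manipulation concrete, here is a minimal sketch of the geometric feature translation that FeTrIL builds on: features of samples from a new class are shifted so that their mean lands on the stored centroid of a past class, yielding pseudo-features for that class without retaining any exemplars. The function name, the 512-dimensional feature size, and the random data are illustrative assumptions, not the paper's code.

```python
import numpy as np

def translate_features(new_feats: np.ndarray,
                       new_centroid: np.ndarray,
                       past_centroid: np.ndarray) -> np.ndarray:
    """Generate pseudo-features for a past class by translating features
    of a new class so that their mean coincides with the stored
    past-class centroid (the geometric translation underlying FeTrIL)."""
    return new_feats + (past_centroid - new_centroid)

# Toy usage with 512-d features from a (frozen) extractor.
rng = np.random.default_rng(0)
feats_new = rng.normal(size=(100, 512))   # features of a new class
mu_new = feats_new.mean(axis=0)           # centroid of the new class
mu_past = rng.normal(size=512)            # stored centroid of a past class
pseudo_past = translate_features(feats_new, mu_new, mu_past)

# The translated features are centered on the past-class centroid.
assert np.allclose(pseudo_past.mean(axis=0), mu_past)
```

Because only class centroids are stored rather than images, the memory cost of representing past classes stays small and constant per class, which is what allows the exemplar-free setting in the first place.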