
[AAAI 2026 Oral] SAMCL: Empowering SAM to Continually Learn from Dynamic Domains with Extreme Storage Efficiency 🥯 [arXiv]
Zeqing Wang¹,², Kangye Ji¹,³, Di Wang¹, Haibin Zhang¹, Fei Cheng¹
¹ Xidian University
² National University of Singapore
³ Tsinghua University

📚 TL;DR (Too Long; Didn't Read)

SAMCL empowers SAM with continual learning (CL) ability across dynamic domains. At a high level, SAMCL decomposes incremental knowledge into separate lightweight modules and trains a selector to choose the appropriate one at inference time. To achieve both effectiveness and storage efficiency, we introduce two components: the AugModule and the Module Selector.
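To make the routing idea concrete, below is a minimal, hypothetical PyTorch sketch. The class and method names (SAMCLSketch, add_domain) and the plain linear layers are illustrative assumptions, not the repository's actual API; see train.py and test.py for the real implementation.

import torch
import torch.nn as nn

# Hypothetical sketch of SAMCL-style routing (names are illustrative, not the
# repository's actual API): a frozen SAM encoder, one lightweight AugModule
# per learned domain, and a Module Selector that routes each input.
class SAMCLSketch(nn.Module):
    def __init__(self, frozen_sam_encoder: nn.Module, feat_dim: int):
        super().__init__()
        self.encoder = frozen_sam_encoder    # shared SAM weights, kept frozen
        self.feat_dim = feat_dim
        self.aug_modules = nn.ModuleList()   # one lightweight module per domain
        self.selector = None                 # rebuilt as new domains arrive

    def add_domain(self):
        # Continual step: append a new AugModule and widen the selector head,
        # instead of rehearsing old data or storing full fine-tuned models.
        self.aug_modules.append(nn.Linear(self.feat_dim, self.feat_dim))
        self.selector = nn.Linear(self.feat_dim, len(self.aug_modules))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(x)                    # (B, feat_dim)
        idx = self.selector(feats).argmax(dim=-1)  # pick one module per sample
        return torch.stack(
            [self.aug_modules[i](f) for i, f in zip(idx.tolist(), feats)]
        )  # domain-adapted features, then fed to SAM's mask decoder

# Toy usage with an identity "encoder":
model = SAMCLSketch(nn.Identity(), feat_dim=256)
model.add_domain()                          # first domain arrives
print(model(torch.randn(4, 256)).shape)     # torch.Size([4, 256])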

🛠️ Implementation

Environment

conda create -n SAMCL python=3.10
conda activate SAMCL

pip install -r requirements.txt
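Before training, you can quickly verify that PyTorch (installed via requirements.txt) can see your GPU:

import torch

# Sanity check: SAMCL training expects a CUDA-capable GPU.
print("torch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))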

Datasets

All training datasets are stored in the following layout (taking the Kvasir-SEG dataset as an example):

data/Kvasir/
├── test/
│   ├── images/
│   │   ├── 0.png
│   │   └── ...
│   └── masks/
│       ├── points.json
│       ├── 0.png
│       └── ...
└── train/
    ├── images/
    │   ├── 0.png
    │   └── ...
    └── masks/
        ├── points.json
        ├── 0.png
        └── ...

All datasets need to be downloaded from their official websites and then processed into the above format. We also provide the datasets used in our experiments at link.

points.json stores the static points of each instance, which are used in the main paper. You can run generate_point.py to generate new points.
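For illustration, a sample could be read from this layout as follows. The points.json schema assumed here (a mapping from image filename to point coordinates) is a guess; consult generate_point.py for the actual format.

import json
from pathlib import Path
from PIL import Image

# Illustrative loader for the layout above. The assumed points.json schema
# (image filename -> list of [x, y] points) is a guess; see generate_point.py.
def load_sample(root, split, name):
    split_dir = Path(root) / split
    image = Image.open(split_dir / "images" / name).convert("RGB")
    mask = Image.open(split_dir / "masks" / name).convert("L")
    points = json.loads((split_dir / "masks" / "points.json").read_text())
    return image, mask, points.get(name)  # point prompt(s) for this image

image, mask, pts = load_sample("data/Kvasir", "test", "0.png")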

Checkpoint

Download the official checkpoint of the ViT-B version of SAM and the checkpoint of the tiny version of SAM2.1 to checkpoint/.

Pre-trained SAMCL checkpoints for both SAM and SAM2 are already provided in checkpoint/ for quick testing.
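If you prefer to script the download, a small helper like the one below works. The URLs are the publicly documented SAM / SAM2.1 release links; double-check them against the official segment-anything and sam2 repositories before relying on them.

import urllib.request
from pathlib import Path

# Fetch the base checkpoints into checkpoint/. The URLs are the publicly
# documented SAM / SAM2.1 release links; verify against the official repos.
CKPTS = {
    "sam_vit_b_01ec64.pth":
        "https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth",
    "sam2.1_hiera_tiny.pt":
        "https://dl.fbaipublicfiles.com/segment_anything_2/092824/sam2.1_hiera_tiny.pt",
}
Path("checkpoint").mkdir(exist_ok=True)
for name, url in CKPTS.items():
    dest = Path("checkpoint") / name
    if not dest.exists():
        print("downloading", name)
        urllib.request.urlretrieve(url, dest)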

Usage

  • Simple Testing
# SAMCL with SAM
python test.py --module SAMCL --cuda 0

# SAMCL with SAM2.1
python test_SAM2.py --module SAMCL_2 --cuda 0
  • Training in a CL manner
# Distributed Training
CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --nnodes 1 --nproc_per_node 4 --master_port=2412 train.py --module SAMCL --batch_size 2 --cuda -1

CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --nnodes 1 --nproc_per_node 4 --master_port=2412 train_SAM2.py --module SAMCL_2 --batch_size 4 --cuda -1


# Single GPU
python train.py --module SAMCL --batch_size 8 --cuda 0 

python train_SAM2.py --module SAMCL_2 --batch_size 16 --cuda 0 

🤓 Acknowledgments

Our continual learning framework is built on SAM. We extend our gratitude to the community for their valuable contributions!

🔗 Citation

@misc{wang2025samclempoweringsamcontinually,
      title={SAMCL: Empowering SAM to Continually Learn from Dynamic Domains with Extreme Storage Efficiency}, 
      author={Zeqing Wang and Kangye Ji and Di Wang and Haibin Zhang and Fei Cheng},
      year={2025},
      eprint={2412.05012},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2412.05012}, 
}
