Tiny parameters leverage big models. OpenDelta performs parameter-efficient tuning of large pre-trained models: by updating only a small fraction of the parameters (less than 5%), its algorithms can match the performance of full-parameter fine-tuning.
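To make the "less than 5%" claim concrete, here is a toy back-of-the-envelope sketch in plain Python (not OpenDelta code; the parameter counts are illustrative, roughly BERT-base-sized): the backbone is frozen and only the small inserted delta modules are trained, so the trainable share of parameters stays tiny.

```python
# Toy illustration of delta tuning's parameter budget (hypothetical sizes).
# The backbone is frozen; only the added "delta" modules are updated.

backbone_params = {      # assumed parameter counts for a frozen backbone
    "embeddings": 23_000_000,
    "encoder": 85_000_000,
    "pooler": 600_000,
}

delta_params = {         # small low-rank modules inserted next to the layers
    "lora_A": 590_000,
    "lora_B": 590_000,
}

trainable = sum(delta_params.values())            # only deltas are updated
total = sum(backbone_params.values()) + trainable
ratio = trainable / total
print(f"trainable fraction: {ratio:.2%}")
```

With these assumed sizes, the trainable fraction comes out at roughly 1%, comfortably under the 5% figure quoted above.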
Easy Usage
A parameter-efficient delta tuning method can be deployed with just a few lines of code, without modifying the backbone model's source code.
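The idea of attaching a delta module without editing the backbone's source can be sketched in plain Python. This is a toy illustration, not the OpenDelta API: a hypothetical `Linear` layer stands in for a library-provided backbone layer, and `attach_delta` wraps its forward method at runtime to add a small trainable correction.

```python
# Toy sketch of runtime delta injection (not OpenDelta code).

class Linear:
    """Stand-in for a backbone layer; imagine it comes from a library
    whose source code we never touch."""
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias

    def forward(self, x):
        return self.weight * x + self.bias

def attach_delta(layer, scale):
    """Wrap layer.forward with a small correction (the 'delta').
    Only `scale` would be trained; the backbone stays frozen."""
    base_forward = layer.forward          # keep the original behaviour
    layer.delta_scale = scale             # the only new parameter
    def forward_with_delta(x):
        return base_forward(x) + layer.delta_scale * x
    layer.forward = forward_with_delta    # inject without editing Linear
    return layer

layer = Linear(weight=2.0, bias=1.0)
attach_delta(layer, scale=0.1)
print(layer.forward(3.0))   # 2*3 + 1 + 0.1*3 = 7.3
```

The backbone class is left untouched; the delta lives entirely in the wrapper, which is the same separation that lets a delta tuning framework plug into existing models.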
Tool Collaboration
OpenDelta can run in collaboration with OpenPrompt to jointly implement the latest paradigms for the adaptation of pre-trained models.
Visual Operation
OpenDelta's integrated visualisation module makes it easy to inspect and manipulate the modules inside a model.
Supported Models
OpenDelta is compatible with the current mainstream language models, and even with the Vision Transformer (ViT).