Tiny parameters, big models. OpenDelta performs parameter-efficient tuning of big models: by updating only a small fraction of the parameters (under 5%), its algorithms can match the performance of full-parameter fine-tuning.
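As a back-of-the-envelope illustration of the "under 5%" claim (the layer sizes and rank below are hypothetical, not taken from OpenDelta), a LoRA-style delta replaces a full d_out × d_in weight update with two low-rank factors:

```python
# Hypothetical sizes: illustrates why delta tuning trains so few
# parameters. Instead of updating a full d_out x d_in weight, a
# LoRA-style method trains only B (d_out x r) and A (r x d_in).

def trainable_fraction(d_in, d_out, r):
    """Fraction of parameters trained when only the low-rank
    factors A and B are updated instead of the full weight."""
    full = d_in * d_out          # parameters in the frozen weight
    delta = r * (d_in + d_out)   # parameters in B and A combined
    return delta / full

# A 4096 x 4096 projection with rank r = 8:
frac = trainable_fraction(4096, 4096, 8)
print(f"{frac:.2%}")  # → 0.39%
```

With these (assumed) sizes the trainable share is well under the 5% quoted above; the exact fraction depends on the model, the rank, and which modules receive deltas.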
A parameter-efficient delta-tuning setup can be deployed with just a few lines of code, without modifying the backbone model's source code.
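The following is a minimal sketch of the idea behind attaching a delta module from the outside, without editing the model's source; it is not OpenDelta's actual implementation, and the `Linear` class and helper name are illustrative stand-ins:

```python
# Sketch (pure Python, hypothetical names): wrap a layer's forward
# from the outside so it gains a low-rank correction B @ (A @ x),
# leaving the original class untouched. Only A and B would be trained.

class Linear:
    """Stand-in for a pre-trained layer whose source we never edit."""
    def __init__(self, weight, bias):
        self.weight, self.bias = weight, bias
    def forward(self, x):
        return [sum(w * xi for w, xi in zip(row, x)) + b
                for row, b in zip(self.weight, self.bias)]

def attach_lowrank_delta(layer, A, B):
    """Patch layer.forward in place to add the delta path."""
    base_forward = layer.forward
    def forward_with_delta(x):
        h = base_forward(x)                                      # frozen path
        z = [sum(a * xi for a, xi in zip(row, x)) for row in A]  # A @ x
        d = [sum(b * zi for b, zi in zip(row, z)) for row in B]  # B @ z
        return [hi + di for hi, di in zip(h, d)]
    layer.forward = forward_with_delta

layer = Linear([[1.0, 0.0], [0.0, 1.0]], [0.0, 0.0])  # identity layer
attach_lowrank_delta(layer, A=[[1.0, 1.0]], B=[[0.5], [0.5]])
print(layer.forward([2.0, 4.0]))  # → [5.0, 7.0]
```

The key design point is that the pre-trained weights stay frozen and untouched; only the small wrapper parameters change, which is what lets a delta-tuning framework bolt onto an existing model in a few lines.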
OpenDelta can work in collaboration with OpenPrompt to jointly implement the latest paradigms for adapting pre-trained models.
OpenDelta's integrated visualisation module allows easy inspection and manipulation of the modules inside a model.
OpenDelta is compatible with today's mainstream language models, and even the Vision Transformer (ViT).