This is [DeBERTa-v3-base](https://hf.co/microsoft/deberta-v3-base) fine-tuned with multi-task learning on 560 tasks of the [tasksource collection](https://github.com/sileod/tasksource/)
This checkpoint has strong zero-shot validation performance on many tasks (e.g. 70% on WNLI) and can be used in a zero-shot NLI pipeline (similar to bart-mnli, but better).
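As a minimal sketch of the zero-shot usage (the checkpoint id below is an assumption — substitute this model's actual hub name):

```python
# Sketch: zero-shot classification with this checkpoint via transformers.
# MODEL_ID is an assumed hub id -- replace with this model's actual name.
from transformers import pipeline

MODEL_ID = "sileod/deberta-v3-base-tasksource-nli"  # assumed hub id

def top_label(text, candidate_labels, model_id=MODEL_ID):
    """Return the highest-scoring label for `text`."""
    classifier = pipeline("zero-shot-classification", model=model_id)
    result = classifier(text, candidate_labels=candidate_labels)
    return result["labels"][0]  # labels come back sorted by score

if __name__ == "__main__":
    print(top_label("I loved this film.", ["positive", "negative"]))
```

The `zero-shot-classification` pipeline converts each candidate label into an NLI hypothesis and ranks labels by entailment probability, which is why an NLI-trained encoder works here.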
You can also load other tasks (see the next paragraph) or further fine-tune the encoder for a new task (classification, token classification, or multiple choice).
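A hedged sketch of reusing the encoder for a new classification head (the checkpoint id and label count are placeholders; `ignore_mismatched_sizes=True` lets a freshly initialized head replace the NLI head):

```python
# Sketch: fine-tuning the multi-task-pretrained encoder on a new task.
# MODEL_ID and NUM_LABELS are assumptions for illustration only.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "sileod/deberta-v3-base-tasksource-nli"  # assumed hub id
NUM_LABELS = 4  # your new task's label count

def load_for_new_task(model_id=MODEL_ID, num_labels=NUM_LABELS):
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # A fresh classification head is initialized on top of the encoder;
    # the size mismatch with the original NLI head is ignored.
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, num_labels=num_labels, ignore_mismatched_sizes=True
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_for_new_task()
    print(model.config.num_labels)
```

For token classification or multiple choice, swap in `AutoModelForTokenClassification` or `AutoModelForMultipleChoice` the same way.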
# Tasksource-adapters: 1 line access to 500 tasks