---
license: apache-2.0
language:
- en
base_model:
- meta-llama/Llama-3.1-8B
library_name: transformers
pipeline_tag: text-generation
---
# UltraIF-8B-SFT
## Links 🔗
The UltraIF model series and data are available on 🤗 HuggingFace.
- 🤗 [UltraComposer](https://huggingface.co/bambisheng/UltraIF-8B-UltraComposer)
- 📄 [SFT Data](https://huggingface.co/datasets/kkk-an/UltraIF-sft-175k) and [SFT Model](https://huggingface.co/bambisheng/UltraIF-8B-SFT)
- ⚖️ [DPO Data](https://huggingface.co/datasets/kkk-an/UltraIF-dpo-20k) and [DPO Model](https://huggingface.co/bambisheng/UltraIF-8B-DPO)

Also check out our 📃 [Paper](https://arxiv.org/abs/2502.04153) and 💻 [code](https://github.com/kkk-an/UltraIF).
## Model Description
UltraIF-8B-SFT is fine-tuned from [Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B) on the 175k-example [UltraIF SFT dataset](https://huggingface.co/datasets/kkk-an/UltraIF-sft-175k).
## Introduction of UltraIF
UltraIF first constructs the **UltraComposer** by decomposing user instructions into simplified ones and constraints, along with corresponding evaluation questions. This specialized composer facilitates the synthesis of instructions with more complex and diverse constraints, while the evaluation questions ensure the correctness and reliability of the generated responses.
Then, we introduce the **Generate-then-Evaluate** process. This framework first uses UltraComposer to incorporate constraints into instructions, and then evaluates the generated responses with the corresponding evaluation questions, yielding responses across various quality levels.
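The loop can be pictured with a short, self-contained sketch of the Generate-then-Evaluate idea as described above. All three helper functions below are hypothetical stand-ins for fine-tuned LLM calls, not the released pipeline or API:
```
# Minimal sketch of Generate-then-Evaluate. `ultra_composer`, `respond`, and
# `passes` are hypothetical placeholders for LLM calls, not the released code.
import random

def ultra_composer(instruction: str) -> tuple[str, list[str]]:
    # Stand-in for UltraComposer: add a constraint to the instruction and
    # emit a matching evaluation question (the real composer is a fine-tuned LLM).
    constrained = instruction + " Use at most 50 words."
    questions = ["Does the response use at most 50 words?"]
    return constrained, questions

def respond(instruction: str) -> str:
    # Stand-in for the response model.
    return f"(candidate response to: {instruction})"

def passes(response: str, question: str) -> bool:
    # Stand-in for answering an evaluation question about the response.
    return random.random() > 0.5

def generate_then_evaluate(simple_instruction: str, n_candidates: int = 4):
    instruction, questions = ultra_composer(simple_instruction)
    # Score each candidate by how many evaluation questions it passes,
    # so the candidates span various quality levels.
    scored = [
        (sum(passes(r, q) for q in questions), r)
        for r in (respond(instruction) for _ in range(n_candidates))
    ]
    # High-scoring responses can feed SFT; high/low pairs can feed DPO.
    best = max(scored, key=lambda t: t[0])
    worst = min(scored, key=lambda t: t[0])
    return best, worst

print(generate_then_evaluate("Write a haiku about the ocean."))
```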

## Usage
You can use the same chat template as Llama-3.1-8B-Instruct to interact with UltraIF-8B-SFT.
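For example, here is a minimal usage sketch with 🤗 Transformers. The model id comes from this card; the prompt and generation settings are illustrative, and we assume the repository's tokenizer ships the Llama-3.1-8B-Instruct chat template (otherwise, copy it from that model):
```
# Minimal usage sketch; generation settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bambisheng/UltraIF-8B-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# UltraIF-8B-SFT follows the Llama-3.1-8B-Instruct chat template.
messages = [
    {"role": "user",
     "content": "Write a haiku about the ocean. Each line must start with the letter S."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```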
## Reference
**🌟 If you find our projects helpful to your research, please consider citing:**
```
@article{an2025ultraif,
  title={UltraIF: Advancing Instruction Following from the Wild},
  author={An, Kaikai and Sheng, Li and Cui, Ganqu and Si, Shuzheng and Ding, Ning and Cheng, Yu and Chang, Baobao},
  journal={arXiv preprint arXiv:2502.04153},
  year={2025}
}
```