---
library_name: transformers
license: mit
datasets:
- HuggingFaceH4/ultrafeedback_binarized
language:
- en
---
This model is released with the preprint *[Bootstrapping Language Models with DPO Implicit Rewards](https://arxiv.org/abs/2406.09760)*. Please refer to our [repository](https://github.com/sail-sg/dice) for more details.
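
Below is a minimal usage sketch with the `transformers` library. The model ID is a placeholder and should be replaced with this repository's name on the Hugging Face Hub; generation settings are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model ID -- replace with this model's Hub repository name.
model_id = "your-org/your-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion from the prompt.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```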