Using DUO

To load the pre-trained model for masked language modeling, use the following snippet:

from transformers import AutoModelForMaskedLM, AutoTokenizer

# See the `DUO` collection page on the Hub for the list of available models.
tokenizer = AutoTokenizer.from_pretrained('gpt2')
model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled')
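
As a quick sanity check, the sketch below runs a single forward pass and inspects the output shape. It assumes the checkpoint follows the standard Hugging Face masked-LM interface (an output object with a `logits` field of shape batch × sequence length × vocabulary size); the prompt and variable names are illustrative, not part of the DUO API.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('gpt2')
# If loading fails, the checkpoint may require trust_remote_code=True.
model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled')
model.eval()

# Encode a short prompt and run a single forward pass.
inputs = tokenizer('The quick brown fox', return_tensors='pt')
with torch.no_grad():
    outputs = model(input_ids=inputs['input_ids'])

# Expect logits of shape (batch, sequence_length, vocab_size).
print(outputs.logits.shape)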

For a hands-on example, check out this Colab notebook. For more information and implementation details, visit our GitHub repository: DUO

Model Details

The model has a context length of 1024 and is similar in size to GPT2-medium, with approximately 130 million non-embedding parameters (about 170M parameters in total, stored in FP32). It was trained for 1M steps on the OpenWebText corpus.
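
To double-check these figures locally, you can count parameters directly on the loaded model (a sketch assuming `model` from the snippet above); the sum includes embedding weights, so it should land near the ~170M total rather than the 130M non-embedding count:

# Total parameter count, embeddings included.
n_params = sum(p.numel() for p in model.parameters())
print(f'{n_params / 1e6:.0f}M parameters')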

For more details, please see our paper: The Diffusion Duality.

Citation

Please cite our work using the BibTeX entry below:

@inproceedings{sahoo2025the,
  title={The Diffusion Duality},
  author={Subham Sekhar Sahoo and Justin Deschenaux and Aaron Gokaslan and Guanghan Wang and Justin T Chiu and Volodymyr Kuleshov},
  booktitle={ICLR 2025 Workshop on Deep Generative Model in Machine Learning: Theory, Principle and Efficacy},
  year={2025},
  url={https://openreview.net/forum?id=CB0Ub2yXjC}
}

Model Card Contact

Subham Sekhar Sahoo ([email protected])
