DUO: The Diffusion Duality
To use the pre-trained model for masked language modeling, load it with the following snippet:
from transformers import AutoModelForMaskedLM, AutoTokenizer

# See the `DUO` collection page on the Hub for the list of available models.
tokenizer = AutoTokenizer.from_pretrained('gpt2')
model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled')
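As a minimal illustration of the masked-LM interface, the sketch below runs a single forward pass to in-fill appended mask positions greedily. It is only a one-shot decode, not the paper's iterative diffusion sampler (see the DUO repository for that); the mask-token index, the need for trust_remote_code=True, and the output's `.logits` field are assumptions about this checkpoint.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('gpt2')
# Assumption: custom architectures on the Hub typically need trust_remote_code=True.
model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled', trust_remote_code=True)

# Assumption: the diffusion mask token occupies the last vocabulary index;
# check the DUO repository for the exact convention used by this checkpoint.
mask_id = model.config.vocab_size - 1

prompt_ids = tokenizer('The capital of France is', return_tensors='pt').input_ids
# Append 5 masked positions to be in-filled.
masked = torch.cat([prompt_ids, torch.full((1, 5), mask_id, dtype=torch.long)], dim=-1)

with torch.no_grad():
    # Assumes a standard MaskedLM-style output exposing a `.logits` field
    # of shape (batch, seq_len, vocab_size).
    logits = model(input_ids=masked).logits

# Greedily decode only the masked positions (a single denoising step,
# for illustration; the real sampler iterates).
filled = logits[0, prompt_ids.shape[1]:].argmax(dim=-1)
print(tokenizer.decode(filled))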
For a hands-on example, check out this Colab notebook. For more information and implementation details, visit our GitHub repository: DUO
The model has a context length of 1024 and is similar in size to GPT-2 Medium, with approximately 130 million non-embedding parameters. It was trained for 1M steps on the OpenWebText corpus.
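As a quick sanity check of the size figure, you can count parameters directly. The sketch below uses a naming heuristic to separate embedding weights (an assumption; adjust the filter to this checkpoint's actual module names):

from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled', trust_remote_code=True)

total = sum(p.numel() for p in model.parameters())
# Heuristic: treat any parameter whose name mentions 'embed' as an
# embedding parameter (an assumption, not the checkpoint's guarantee).
embed = sum(p.numel() for n, p in model.named_parameters() if 'embed' in n.lower())
print(f'total: {total / 1e6:.0f}M, non-embedding: {(total - embed) / 1e6:.0f}M')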
For more details, please see our paper: The Diffusion Duality.
Please cite our work using the following BibTeX:
@inproceedings{sahoo2025the,
  title={The Diffusion Duality},
  author={Subham Sekhar Sahoo and Justin Deschenaux and Aaron Gokaslan and Guanghan Wang and Justin T Chiu and Volodymyr Kuleshov},
  booktitle={ICLR 2025 Workshop on Deep Generative Model in Machine Learning: Theory, Principle and Efficacy},
  year={2025},
  url={https://openreview.net/forum?id=CB0Ub2yXjC}
}
Contact: Subham Sekhar Sahoo ([email protected])