---

# SpokenSwag

We present here SpokenSwag, as described in the paper ["_Slamming_: Training a Speech Language Model on One GPU in a Day"](https://arxiv.org/abs/2502.15814).
This dataset is based on [allenai/swag](https://huggingface.co/datasets/allenai/swag) and synthesized with 4 speakers from [hexgrad/Kokoro-82M](https://huggingface.co/hexgrad/Kokoro-82M).
We show that performing DPO on this dataset can significantly improve the performance of Speech Language Models.
We encourage you to also see the following resources for further information:

**Project Page:** https://pages.cs.huji.ac.il/adiyoss-lab/slamming/ \
**Paper:** https://arxiv.org/abs/2502.15814 \
**Code:** https://github.com/slp-rl/slamkit
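DPO trains on preference pairs: a shared prompt plus a preferred ("chosen") and a dispreferred ("rejected") continuation. As a minimal illustrative sketch of how SWAG-style data can be arranged into such pairs (the field names follow the common prompt/chosen/rejected convention used by DPO trainers and are an assumption, not this dataset's actual schema):

```python
# Illustrative sketch only: arranging a SWAG-style example as a DPO
# preference record. Field names (prompt/chosen/rejected) are an assumed
# convention, not necessarily the actual columns of this dataset.

def make_preference_pair(context: str, gold_ending: str, distractor: str) -> dict:
    """Build one preference record from a SWAG-style example."""
    return {
        "prompt": context,       # shared context
        "chosen": gold_ending,   # plausible continuation the model should prefer
        "rejected": distractor,  # implausible continuation to move away from
    }

pair = make_preference_pair(
    "She unplugs the toaster,",
    "then wipes the crumbs off the counter.",
    "then the toaster flies out the window.",
)
```

For the spoken setting, each text field would correspond to synthesized audio (here, from the 4 Kokoro-82M speakers) rather than raw text.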

If you use our dataset, please cite the paper as follows:
```
@misc{maimon2025slamming,
      title={Slamming: Training a Speech Language Model on One GPU in a Day},
      author={Gallil Maimon and Avishai Elmakies and Yossi Adi},
      year={2025},
      eprint={2502.15814},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2502.15814},
}
```