Add link to paper
This PR ensures the model can be viewed from https://huggingface.co/papers/2012.02110.
README.md CHANGED

```diff
@@ -17,7 +17,7 @@ GottBERT is the first German-only RoBERTa model, pre-trained on the German porti
 - **Large Model**: 24 layers, 355 million parameters
 - **License**: MIT
 
-
+This was presented in [GottBERT: a pure German Language Model](https://huggingface.co/papers/2012.02110).
 
 ## Pretraining Details
 
```
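For readers of the model card this PR edits, here is a minimal sketch of loading a GottBERT checkpoint with the `transformers` library. The repository id `TUM/GottBERT_large` and the example sentence are illustrative assumptions, not confirmed by this PR; substitute the id of the repo this card actually belongs to.

```python
# Minimal sketch (not part of this PR): loading a GottBERT checkpoint with
# Hugging Face transformers. The repository id below is an assumption --
# replace it with the id of the model repo this card belongs to.
from transformers import AutoTokenizer, AutoModelForMaskedLM

repo_id = "TUM/GottBERT_large"  # hypothetical id for the 24-layer, 355M-parameter variant

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# GottBERT follows the RoBERTa architecture, so masked-token prediction uses <mask>.
inputs = tokenizer("Die Hauptstadt von Deutschland ist <mask>.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```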