burakaytan committed • Commit a7b55dc • Parent(s): 1789c36
Update README.md
README.md CHANGED
@@ -6,6 +6,8 @@ This is a Turkish RoBERTa base model pretrained on Turkish Wikipedia, Turkish OS
 
 The final training corpus has a size of 38 GB and 329.720.508 sentences.
 
+Thanks to Turkcell we could train the model on Intel(R) Xeon(R) Gold 6230R CPU @ 2.10GHz 256GB RAM 2 x GV100GL [Tesla V100 PCIe 32GB] GPU
+
 # Usage
 
 Load transformers library with:
 ```
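The diff is cut off at the opening code fence of the Usage section, so the actual loading snippet is not visible here. As a hedged sketch only, loading such a model with the `transformers` library might look like the following; the repo id is an assumption inferred from the commit author's namespace, not something stated in the visible diff:

```python
# Sketch of loading a RoBERTa masked-language model with transformers.
# MODEL_ID is an assumption (inferred from the commit author); replace it
# with the model's actual Hugging Face repo id.
MODEL_ID = "burakaytan/roberta-base-turkish-uncased"

def load_fill_mask(model_id: str = MODEL_ID):
    """Load tokenizer and model, then return a fill-mask pipeline.

    Requires `pip install transformers` and network access to download
    the weights; the import is kept inside the function so the module
    can be inspected without transformers installed.
    """
    from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForMaskedLM.from_pretrained(model_id)
    return pipeline("fill-mask", model=model, tokenizer=tokenizer)
```

A returned fill-mask pipeline can then be called on a Turkish sentence containing the tokenizer's mask token to get top completions.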