Update README.md
README.md
CHANGED
@@ -1,6 +1,18 @@
+---
+license: mit
+language:
+- code
+datasets:
+- code_search_net
+---

+This is an *unofficial* reupload of [microsoft/codebert-base-mlm](https://huggingface.co/microsoft/codebert-base-mlm) in the `SafeTensors` format using `transformers` `4.40.1`. The goal of this reupload is to prevent older models that are still relevant baselines from becoming stale as a result of changes in HuggingFace. Additionally, I may include minor corrections, such as model max length configuration.

-
+Original model card below:
+
+---
+
+## CodeBERT-base-mlm
Pretrained weights for [CodeBERT: A Pre-Trained Model for Programming and Natural Languages](https://arxiv.org/abs/2002.08155).

### Training Data
@@ -45,4 +57,4 @@ Expected results:
archivePrefix={arXiv},
primaryClass={cs.CL}
}
-```
+```
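
For anyone who wants to reproduce this kind of reupload, the sketch below shows one way to do it with `transformers`. The output directory name and the 512-token `model_max_length` are illustrative assumptions, not values taken from this commit.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

SOURCE = "microsoft/codebert-base-mlm"    # original checkpoint on the Hub
TARGET = "codebert-base-mlm-safetensors"  # hypothetical local output directory

# Load the original PyTorch weights and tokenizer from the Hub.
model = AutoModelForMaskedLM.from_pretrained(SOURCE)
tokenizer = AutoTokenizer.from_pretrained(SOURCE)

# Assumed correction: pin the tokenizer to the model's 512-token position limit.
tokenizer.model_max_length = 512

# Re-save the weights; safe_serialization=True writes model.safetensors
# instead of pytorch_model.bin.
model.save_pretrained(TARGET, safe_serialization=True)
tokenizer.save_pretrained(TARGET)
```

The saved folder can then be uploaded to a new repository (for example with `huggingface_hub`'s `upload_folder`), with the metadata added in this commit's README frontmatter.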