rushabhGod
committed on
End of training

Browse files
- README.md +8 -8
- logs/events.out.tfevents.1737084820.PC2031.22320.1 +3 -0
- tokenizer.json +0 -0
- tokenizer_config.json +2 -1
README.md
CHANGED
@@ -16,14 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Answer: {'precision': 0.
-- Header: {'precision': 0.
-- Question: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 1.5924
+- Answer: {'precision': 0.8748538011695907, 'recall': 0.9155446756425949, 'f1': 0.8947368421052633, 'number': 817}
+- Header: {'precision': 0.64, 'recall': 0.5378151260504201, 'f1': 0.5844748858447488, 'number': 119}
+- Question: {'precision': 0.8945487042001787, 'recall': 0.9294336118848654, 'f1': 0.9116575591985429, 'number': 1077}
+- Overall Precision: 0.8742
+- Overall Recall: 0.9006
+- Overall F1: 0.8872
+- Overall Accuracy: 0.8193
 
 ## Model description
 
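For reference, a minimal usage sketch (not part of this commit) of a LiLT checkpoint fine-tuned from SCUT-DLVCLab/lilt-roberta-en-base, as the README above describes. The repo id and the sample words/boxes below are placeholders, not values taken from this repository.

```python
# Hedged sketch: load a fine-tuned LiLT token-classification checkpoint.
# "rushabhGod/lilt-finetuned-funsd" is a placeholder repo id; substitute the
# actual model path. The words/boxes are invented example inputs.
from transformers import AutoTokenizer, LiltForTokenClassification

checkpoint = "rushabhGod/lilt-finetuned-funsd"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = LiltForTokenClassification.from_pretrained(checkpoint)

# The saved tokenizer is layout-aware, so it accepts word-level bounding
# boxes normalized to a 0-1000 coordinate space.
words = ["Invoice", "Number:", "12345"]
boxes = [[48, 40, 170, 62], [180, 40, 300, 62], [310, 40, 400, 62]]

encoding = tokenizer(words, boxes=boxes, return_tensors="pt")
outputs = model(**encoding)
# Per-token label ids (e.g. the question/answer/header classes reported above).
predictions = outputs.logits.argmax(-1)
```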
logs/events.out.tfevents.1737084820.PC2031.22320.1
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ad1e8270d0dbc37d0f751dbdedb420d48af48793b24f01d88d5b8b13dbc7b19c
+size 592
tokenizer.json
CHANGED
The diff for this file is too large to render. See raw diff.
tokenizer_config.json
CHANGED
@@ -43,7 +43,7 @@
     }
   },
   "bos_token": "<s>",
-  "clean_up_tokenization_spaces":
+  "clean_up_tokenization_spaces": false,
   "cls_token": "<s>",
   "cls_token_box": [
     0,
@@ -53,6 +53,7 @@
   ],
   "eos_token": "</s>",
   "errors": "replace",
+  "extra_special_tokens": {},
   "mask_token": "<mask>",
   "model_max_length": 512,
   "only_label_first_subword": true,
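As a brief hedged illustration (not from the commit): after this change, the saved tokenizer should report the updated setting when reloaded. The repo id below is the same placeholder as above.

```python
# Sketch under assumptions: "rushabhGod/lilt-finetuned-funsd" is a placeholder
# repo id for the tokenizer updated by this commit.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("rushabhGod/lilt-finetuned-funsd")
print(tok.clean_up_tokenization_spaces)  # expected False after this commit
print(tok.model_max_length)              # 512, unchanged
```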