Update src/about.py
src/about.py CHANGED (+1 -0)
@@ -68,6 +68,7 @@ LLM_BENCHMARKS_TEXT = f"""
 
 ## Metrics
 π Our evaluation metrics include, but are not limited to, Accuracy, F1 Score, ROUGE score, BERTScore, and Matthews correlation coefficient (MCC), providing a multidimensional assessment of model performance.
+
 Metrics for specific tasks are as follows:
 
 FPB-F1
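For reference, the classification metrics named in the diffed text (Accuracy, F1, MCC) are standard scores. The sketch below is illustrative only and is not part of this commit or of src/about.py; it assumes scikit-learn and uses made-up labels to show how such metrics could be computed:

```python
# Minimal sketch (assumption: scikit-learn is available; labels are hypothetical).
from sklearn.metrics import accuracy_score, f1_score, matthews_corrcoef

y_true = [0, 1, 2, 1, 0, 2]  # hypothetical gold labels (e.g. sentiment classes)
y_pred = [0, 1, 1, 1, 0, 2]  # hypothetical model predictions

print("Accuracy:", accuracy_score(y_true, y_pred))
print("F1 (macro):", f1_score(y_true, y_pred, average="macro"))
print("MCC:", matthews_corrcoef(y_true, y_pred))
```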