Update src/about.py
src/about.py  CHANGED  (+3 −0)
@@ -69,8 +69,11 @@ LLM_BENCHMARKS_TEXT = f"""
 ## Metrics
 π Our evaluation metrics include, but are not limited to, Accuracy, F1 Score, ROUGE score, BERTScore, and Matthews correlation coefficient (MCC), providing a multidimensional assessment of model performance.
 Metrics for specific tasks are as follows:
+
 FPB-F1
+
 FiQA-SA-F1
+
 TSA-RMSE
 Headlines-AvgF1
 FOMC-F1
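This change only adds blank lines to the documentation string, but the metrics it lists (F1, MCC, RMSE) are standard. As a minimal sketch of how such scores are typically computed, assuming a scikit-learn based evaluation (the leaderboard's actual evaluation code is not shown in this diff, and the data below is made up for illustration):

```python
# Illustrative sketch only: standard scikit-learn calls for the metrics named
# above (F1, MCC, RMSE). This is NOT the leaderboard's evaluation code.
from sklearn.metrics import f1_score, matthews_corrcoef, mean_squared_error

# Classification tasks (e.g. FPB, FiQA-SA, FOMC): weighted F1 and MCC.
y_true = ["positive", "negative", "neutral", "positive"]
y_pred = ["positive", "neutral", "neutral", "positive"]
print("F1 :", f1_score(y_true, y_pred, average="weighted"))
print("MCC:", matthews_corrcoef(y_true, y_pred))

# Regression-style sentiment scoring (e.g. TSA): root-mean-square error.
scores_true = [0.6, -0.2, 0.1]
scores_pred = [0.5, -0.1, 0.0]
print("RMSE:", mean_squared_error(scores_true, scores_pred) ** 0.5)
```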