Tasks: Text Classification (multi-label-classification)
Modalities: Text
Formats: arrow
Languages: English
Size: 1K - 10K
License: cc-by-4.0
ilos-vigil committed: Update README.md
README.md
CHANGED
@@ -52,18 +52,55 @@ There are 8 aspects to define review in this dataset. I am the only annotator fo

Take note that a few reviews contain language and content that some people may find offensive, discriminatory, or inappropriate. I **DO NOT** endorse, condone or promote any such language and content.

-
+# Model benchmark

-
+Model benchmarks on the Steam Review aspect dataset are split into 3 categories,

-
+* Base: Non-attention-based language models.
+* Embedding: Inspired by MTEB, the obtained embeddings are used to train a Logistic Regression classifier for up to 100 epochs.
+* Fine-tune.
+
+Source code for running these models is available on [GitHub](https://github.com/ilos-vigil/steam-review-aspect-dataset/tree/main/model_benchmark). Take note that it may not follow best practices, as it was written to be run only once. I ran the benchmarks on Linux with an RTX 3060 and 32 GB RAM.
+
+> Base
+
+| Model              | Macro precision | Macro recall | Macro F1 | Note                                                                   |
+| ------------------ | --------------- | ------------ | -------- | ---------------------------------------------------------------------- |
+| Spacy Bag of Words | 0.6203          | 0.5391       | 0.5494   |                                                                        |
+| FastText           | 0.6284          | 0.5713       | 0.5871   | Minimal text preprocessing, uses pretrained vectors                    |
+| FastText           | 0.6933          | 0.5821       | 0.6027   | Minimal text preprocessing, hyperparameters chosen via 5-fold autotune |
+| Spacy Ensemble     | 0.6043          | 0.6773       | 0.6299   | Hyperparameters chosen via simple grid search                          |
+
+> Embedding
+
+| Model                                                     | Param | Max tokens | Macro precision | Macro recall | Macro F1 | Note                                 |
+| --------------------------------------------------------- | ----- | ---------- | --------------- | ------------ | -------- | ------------------------------------ |
+| sentence-transformers/all-mpnet-base-v2                   | 110M  | 514        | 0.7074          | 0.5431       | 0.5853   |                                      |
+| jinaai/jina-embeddings-v2-small-en                        | 137M  | 8192       | 0.7068          | 0.6075       | 0.6437   |                                      |
+| jinaai/jina-embeddings-v2-base-en                         | 137M  | 8192       | 0.6813          | 0.6501       | 0.6618   |                                      |
+| Alibaba-NLP/gte-large-en-v1.5                             | 434M  | 8192       | 0.7001          | 0.6501       | 0.6729   |                                      |
+| nomic-ai/nomic-embed-text-v1.5                            | 137M  | 8192       | 0.7075          | 0.6498       | 0.6756   |                                      |
+| McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised | 7111M | 32768      | 0.7238          | 0.6697       | 0.6928   | NF4 double quantization, instruction |
+| WhereIsAI/UAE-Large-V1                                    | 335M  | 512        | 0.7245          | 0.6718       | 0.6946   |                                      |
+| mixedbread-ai/mxbai-embed-large-v1                        | 335M  | 512        | 0.7215          | 0.6817       | 0.6989   |                                      |
+| intfloat/e5-mistral-7b-instruct                           | 7111M | 32768      | 0.7345          | 0.7000       | 0.7137   | NF4 double quantization, instruction |
+
+> Fine-tune
+
+| Model                             | Param | Max tokens | Macro precision | Macro recall | Macro F1 | Note                                            |
+| --------------------------------- | ----- | ---------- | --------------- | ------------ | -------- | ----------------------------------------------- |
+| jinaai/jina-embeddings-v2-base-en | 137M  | 8192       | 0.7485          | 0.7257       | 0.7354   | Hyperparameters chosen via Ray Tune (30 trials) |
+| Alibaba-NLP/gte-large-en-v1.5     | 434M  | 8192       | 0.8403          | 0.8152       | 0.8231   | Hyperparameters chosen via Ray Tune (16 trials) |
+
+
+# Download

You can download the Steam review aspect dataset from here (HuggingFace) or from one of these sources,

* [GitHub](https://github.com/ilos-vigil/steam-review-aspect-dataset)
* [Kaggle](https://www.kaggle.com/datasets/ilosvigil/steam-review-aspect-dataset)

-
+# Citation

If you wish to use this dataset in your research or project, please cite this blog post: [Steam review aspect dataset](https://srec.ai/blog/steam-review-aspect-dataset)

@@ -85,6 +122,6 @@ For those who need it, a BibTeX citation format also has been prepared.
}
```

-
+# License

Steam Review aspect dataset is licensed under [Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0).
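To experiment with the benchmark setups described in the diff above, the first step is loading the data. The card lists the format as Arrow, so the `datasets` library is the most direct route; the GitHub and Kaggle mirrors under Download hold the same data as flat files. A minimal sketch; the repository id and split name are assumptions taken from this page, not quoted from the dataset card:

```python
# Minimal sketch: pull the dataset straight from the Hugging Face Hub.
# Repo id and split name are assumptions; check the dataset card.
from datasets import load_dataset

ds = load_dataset("ilos-vigil/steam-review-aspect-dataset")
print(ds)              # splits and row counts (card lists 1K - 10K rows)
print(ds["train"][0])  # one review together with its 8 aspect labels
```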
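The FastText rows in the Base table treat the task as multi-label classification with the one-vs-all loss; one row uses pretrained vectors, the other lets fastText's built-in autotune pick hyperparameters (the table mentions 5-fold autotune, so the single validation file below would in practice be wrapped in a fold loop). A rough sketch, assuming the reviews have already been written out in fastText's `__label__` format; file names, the autotune budget, the example aspect names, and the prediction threshold are all assumptions:

```python
# Sketch of the FastText baseline: one-vs-all loss for multi-label aspects,
# with hyperparameters picked by fastText's autotune.
import fasttext

# train.txt / valid.txt: one review per line, prefixed with its aspect labels,
# e.g. "__label__story __label__price Great story but overpriced."
# (aspect names here are made up for illustration)
model = fasttext.train_supervised(
    input="train.txt",
    loss="ova",                          # one-vs-all, i.e. independent binary classifiers
    autotuneValidationFile="valid.txt",  # let fastText search hyperparameters
    autotuneDuration=600,                # search budget in seconds (assumed)
)

# Return every label whose predicted probability exceeds the threshold.
labels, probs = model.predict("Great gameplay but the PC port is buggy", k=-1, threshold=0.5)
print(labels, probs)
```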
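For the Embedding category, the MTEB-inspired protocol is: encode each review once with a frozen embedding model, then fit a Logistic Regression on those vectors and report macro precision/recall/F1. A minimal sketch of that setup; the repository id, split names, and column names (`review` plus eight binary aspect columns) are assumptions rather than the dataset's documented schema:

```python
# Sketch of the "Embedding" benchmark setup: frozen sentence embeddings fed to
# a one-vs-rest Logistic Regression, scored with macro precision/recall/F1.
# Repo id, split names and column names are assumptions.
import numpy as np
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_fscore_support
from sklearn.multiclass import OneVsRestClassifier

ds = load_dataset("ilos-vigil/steam-review-aspect-dataset")          # assumed repo id
aspects = [c for c in ds["train"].column_names if c != "review"]      # assumed schema

encoder = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
X_train = encoder.encode(ds["train"]["review"], show_progress_bar=True)
X_test = encoder.encode(ds["test"]["review"], show_progress_bar=True)
y_train = np.array([ds["train"][a] for a in aspects]).T               # (n_samples, 8) binary matrix
y_test = np.array([ds["test"][a] for a in aspects]).T

# "up to 100 epochs" maps onto the logistic regressor's max_iter here.
clf = OneVsRestClassifier(LogisticRegression(max_iter=100))
clf.fit(X_train, y_train)

prec, rec, f1, _ = precision_recall_fscore_support(
    y_test, clf.predict(X_test), average="macro", zero_division=0
)
print(f"macro precision={prec:.4f} recall={rec:.4f} F1={f1:.4f}")
```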
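The "NF4 double quantization, instruction" note on the two 7B embedding models refers to loading the weights in 4-bit NF4 with double quantization, which is what lets a 7B encoder fit on a 12 GB RTX 3060, and to prefixing each input with a task instruction. A sketch of that loading step using `transformers` with `bitsandbytes`; the instruction wording and the last-token pooling are simplified assumptions, not the exact recipe behind the table:

```python
# Sketch: load intfloat/e5-mistral-7b-instruct with NF4 double quantization.
# Instruction wording and pooling below are simplified assumptions.
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",        # NF4 quantization
    bnb_4bit_use_double_quant=True,   # double quantization
    bnb_4bit_compute_dtype=torch.bfloat16,
)

name = "intfloat/e5-mistral-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModel.from_pretrained(name, quantization_config=quant_config, device_map="auto")

text = (
    "Instruct: Identify which aspects a Steam review discusses\n"
    "Query: Great gameplay, but the PC port is buggy."
)
batch = tokenizer(text, return_tensors="pt").to(model.device)
with torch.no_grad():
    hidden = model(**batch).last_hidden_state
embedding = hidden[:, -1]             # last-token pooling (single, unpadded sequence)
print(embedding.shape)
```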
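For the Fine-tune rows, "Hyperparameters chosen via Ray Tune" can be approximated with the `transformers` Trainer's built-in `hyperparameter_search` on the Ray backend. The sketch below puts a multi-label classification head on jinaai/jina-embeddings-v2-base-en; the search space, metric wiring, column names, and the assumption that the checkpoint's remote code exposes a sequence-classification head are all mine, not the benchmark's actual configuration:

```python
# Sketch of the fine-tune setup: multi-label head + Ray Tune search through
# Trainer.hyperparameter_search. Search space, columns and metric wiring are
# assumptions.
import numpy as np
from datasets import load_dataset
from ray import tune
from sklearn.metrics import f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

NAME = "jinaai/jina-embeddings-v2-base-en"
ds = load_dataset("ilos-vigil/steam-review-aspect-dataset")          # assumed repo id
aspects = [c for c in ds["train"].column_names if c != "review"]      # assumed schema
tokenizer = AutoTokenizer.from_pretrained(NAME)

def preprocess(batch):
    # Fixed-length padding keeps the default collator happy in this sketch.
    enc = tokenizer(batch["review"], truncation=True, padding="max_length", max_length=512)
    # Multi-label heads use BCEWithLogitsLoss, so targets must be floats.
    enc["labels"] = [[float(batch[a][i]) for a in aspects] for i in range(len(batch["review"]))]
    return enc

ds = ds.map(preprocess, batched=True, remove_columns=ds["train"].column_names)

def model_init():
    # trust_remote_code: jina v2 ships custom modeling code; this assumes it
    # exposes a sequence-classification head (otherwise attach your own).
    return AutoModelForSequenceClassification.from_pretrained(
        NAME, num_labels=len(aspects),
        problem_type="multi_label_classification", trust_remote_code=True,
    )

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = (logits > 0).astype(int)                  # sigmoid(x) > 0.5  <=>  x > 0
    return {"macro_f1": f1_score(labels.astype(int), preds, average="macro", zero_division=0)}

trainer = Trainer(
    model_init=model_init,
    args=TrainingArguments(output_dir="out", eval_strategy="epoch", fp16=True),
    train_dataset=ds["train"],
    eval_dataset=ds["test"],
    compute_metrics=compute_metrics,
)

best = trainer.hyperparameter_search(
    direction="maximize",
    backend="ray",
    n_trials=30,                                      # mirrors the 30-trial row
    hp_space=lambda _: {
        "learning_rate": tune.loguniform(1e-5, 5e-5),
        "num_train_epochs": tune.choice([2, 3, 4, 5]),
        "per_device_train_batch_size": tune.choice([8, 16]),
    },
    compute_objective=lambda metrics: metrics["eval_macro_f1"],
)
print(best.hyperparameters)
```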