prithivMLmods committed
Commit 7c26608 · verified · 1 Parent(s): 9dd336a

Update README.md

Files changed (1): README.md (+4 −2)
README.md CHANGED
@@ -8,6 +8,9 @@ pipeline_tag: text-generation
 library_name: transformers
 tags:
 - reason
+datasets:
+- Magpie-Align/Magpie-Reasoning-V2-250K-CoT-QwQ
+- ngxson/MiniThinky-dataset
 ---
 
 ![logo.png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/Rqm-Qx8AvbHFFbFbVY93X.png)
@@ -66,5 +69,4 @@ Despite its capabilities, Bellatrix has some limitations:
 2. **Dependence on Training Data**: It is only as good as the quality and diversity of its training data, which may lead to biases or inaccuracies.
 3. **Computational Resources**: The model’s optimized transformer architecture can be resource-intensive, requiring significant computational power for fine-tuning and inference.
 4. **Language Coverage**: While multilingual, some languages or dialects may have limited support or lower performance compared to widely used ones.
-5. **Real-World Contexts**: It may struggle with understanding nuanced or ambiguous real-world scenarios not covered during training.
-
+5. **Real-World Contexts**: It may struggle with understanding nuanced or ambiguous real-world scenarios not covered during training.
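The front-matter change in this commit adds a `datasets` list to the card's YAML metadata block. As a minimal sketch of what that block encodes, the snippet below extracts the `datasets` entries from a README's front matter. The parsing helpers (`frontmatter_lines`, `list_values`) are hypothetical names written for illustration; real Hub tooling uses a full YAML parser rather than this hand-rolled scan.

```python
# Sketch: read the "datasets" list out of the YAML front matter
# (the block delimited by "---" lines) that this commit edits.
README = """\
---
pipeline_tag: text-generation
library_name: transformers
tags:
- reason
datasets:
- Magpie-Align/Magpie-Reasoning-V2-250K-CoT-QwQ
- ngxson/MiniThinky-dataset
---
# Model card body...
"""

def frontmatter_lines(text):
    """Return the lines between the first pair of '---' delimiters."""
    lines = text.splitlines()
    assert lines[0] == "---", "front matter must start the file"
    end = lines.index("---", 1)
    return lines[1:end]

def list_values(lines, key):
    """Collect '- item' entries that directly follow 'key:'."""
    items, active = [], False
    for line in lines:
        if line == f"{key}:":
            active = True
        elif active and line.startswith("- "):
            items.append(line[2:])
        else:
            active = False
    return items

fm = frontmatter_lines(README)
print(list_values(fm, "datasets"))
# ['Magpie-Align/Magpie-Reasoning-V2-250K-CoT-QwQ', 'ngxson/MiniThinky-dataset']
```

On the Hub, these keys drive the card's UI: `pipeline_tag` selects the widget, `library_name` the load snippet, and `datasets` links the card to the listed training datasets.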