---
base_model:
- appvoid/palmer-002-32k
- raidhon/coven_tiny_1.1b_32k_orpo_alpha
- appvoid/palmer-003
library_name: transformers
tags:
- mergekit
- merge
license: apache-2.0
---
### june update
This model has improved overall performance at the expense of a small degradation on winogrande. As with all palmer models, it is biased toward answering questions without any specific prompt format; feel free to further fine-tune it for your specific use case.
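A minimal usage sketch with `transformers`, assuming the model is published under the `appvoid/palmer-004` repo id (inferred from the table below). Since the model responds without a specific prompt template, plain text completion works:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; adjust if the model lives elsewhere.
model_id = "appvoid/palmer-004"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# No prompt template needed: plain text completion.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```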
| Model | MMLU | ARC-C | HellaSwag | PIQA | Winogrande | Average |
|---|---|---|---|---|---|---|
| tinyllama-3t | 0.2577 | 0.3029 | 0.5935 | 0.7329 | 0.5959 | 0.4966 |
| palmer-004-old | 0.2601 | 0.3456 | 0.6138 | 0.7443 | 0.6511 | 0.5229 |
| palmer-004 | 0.2661 | 0.3490 | 0.6173 | 0.7481 | 0.6417 | 0.5244 |
### note
Even though palmer-003 has only a 2k context window, it scores 0.5257 on average, so if you don't need the larger 32k context, you are better off with palmer-003.