| Column | dtype | Range / distinct values |
|---|---|---|
| eval_name | string | lengths 12 – 111 |
| Precision | string | 3 distinct values |
| Type | string | 6 distinct values |
| T | string | 6 distinct values |
| Weight type | string | 3 distinct values |
| Architecture | string | 59 distinct values |
| Model | string | lengths 355 – 689 |
| fullname | string | lengths 4 – 102 |
| Model sha | string | lengths 0 – 40 |
| Average ⬆️ | float64 | 1.03 – 52 |
| Hub License | string | 25 distinct values |
| Hub ❤️ | int64 | 0 – 5.96k |
| #Params (B) | float64 | -1 – 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03 – 107 |
| IFEval Raw | float64 | 0 – 0.9 |
| IFEval | float64 | 0 – 90 |
| BBH Raw | float64 | 0.27 – 0.75 |
| BBH | float64 | 0.81 – 63.5 |
| MATH Lvl 5 Raw | float64 | 0 – 0.51 |
| MATH Lvl 5 | float64 | 0 – 50.7 |
| GPQA Raw | float64 | 0.22 – 0.44 |
| GPQA | float64 | 0 – 24.9 |
| MUSR Raw | float64 | 0.29 – 0.6 |
| MUSR | float64 | 0 – 38.5 |
| MMLU-PRO Raw | float64 | 0.1 – 0.73 |
| MMLU-PRO | float64 | 0 – 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 457 distinct values |
| Submission Date | string | 200 distinct values |
| Generation | int64 | 0 – 10 |
| Base Model | string | lengths 4 – 102 |
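The columns above follow the Open LLM Leaderboard results schema on the Hugging Face Hub: each benchmark appears twice, as a raw 0–1 accuracy ("IFEval Raw", "BBH Raw", ...) and as a normalized 0–100 score ("IFEval", "BBH", ...), with "Average ⬆️" aggregating the normalized scores. Below is a minimal sketch of loading and inspecting such a table with the `datasets` library; the repo id `open-llm-leaderboard/contents` and the `train` split are assumptions, since the records here only link the per-model `*-details` datasets.

```python
# Minimal sketch, not the leaderboard's own tooling.
# Assumptions: the records below are published as a Hub dataset whose columns
# match the schema above; "open-llm-leaderboard/contents" and the "train"
# split are guesses and are not confirmed by this dump.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")
print(ds.column_names)  # should mirror the 36 columns listed above

# Rank the evaluated models by their aggregate normalized score.
top = sorted(ds, key=lambda row: row["Average ⬆️"], reverse=True)[:5]
for row in top:
    print(f'{row["fullname"]:60s} {row["Average ⬆️"]:6.2f}  {row["#Params (B)"]:7.2f} B')
```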
someon98_qwen-CoMa-0.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[someon98/qwen-CoMa-0.5b](https://huggingface.co/someon98/qwen-CoMa-0.5b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/someon98__qwen-CoMa-0.5b-details)
someon98/qwen-CoMa-0.5b
67336cfb494c0aa1995be0efdeeb9fb0c6a386fe
5.782531
0
0.63
false
false
false
false
0.509179
0.227664
22.766371
0.295334
2.126794
0
0
0.239933
0
0.404573
8.704948
0.109874
1.097074
false
false
2024-12-29
2024-12-29
1
someon98/qwen-CoMa-0.5b (Merge)
sometimesanotion_IF-reasoning-experiment-40_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/IF-reasoning-experiment-40](https://huggingface.co/sometimesanotion/IF-reasoning-experiment-40) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__IF-reasoning-experiment-40-details)
sometimesanotion/IF-reasoning-experiment-40
0064fffb67d18b0f946b6e7bf3227ca0c92af3eb
37.207181
0
7.383
false
false
false
false
1.904959
0.632979
63.297938
0.611186
44.306408
0.27719
27.719033
0.380034
17.337808
0.519417
25.860417
0.502493
44.721483
false
false
2024-12-29
0
Removed
sometimesanotion_IF-reasoning-experiment-80_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/IF-reasoning-experiment-80](https://huggingface.co/sometimesanotion/IF-reasoning-experiment-80) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__IF-reasoning-experiment-80-details)
sometimesanotion/IF-reasoning-experiment-80
d1441e8bd87f11235fd4c708f6ece69a9973c343
21.77674
0
7.383
false
false
false
false
1.886982
0.546276
54.62761
0.421038
17.48234
0.046828
4.682779
0.284396
4.58613
0.502458
22.973958
0.336769
26.307624
false
false
2024-12-29
0
Removed
sometimesanotion_Lamarck-14B-v0.1-experimental_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.1-experimental](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.1-experimental) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.1-experimental-details)
sometimesanotion/Lamarck-14B-v0.1-experimental
b0600e08e8c97b25d1abca543b997d9927245442
36.670996
0
14.766
false
false
false
false
1.894693
0.535385
53.5385
0.658254
50.794908
0.305136
30.513595
0.381711
17.561521
0.472844
18.638802
0.540808
48.97865
false
false
2024-12-09
0
Removed
sometimesanotion_Lamarck-14B-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.3](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.3-details)
sometimesanotion/Lamarck-14B-v0.3
781637d1b65766fe933ebde070632e48f91390ab
36.576096
apache-2.0
2
14.766
true
false
false
false
7.639181
0.503162
50.316161
0.66114
51.274309
0.324018
32.401813
0.388423
18.456376
0.468813
18.001563
0.541057
49.006353
true
false
2024-12-06
2024-12-09
1
sometimesanotion/Lamarck-14B-v0.3 (Merge)
sometimesanotion_Lamarck-14B-v0.4-Qwenvergence_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.4-Qwenvergence](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.4-Qwenvergence) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.4-Qwenvergence-details)
sometimesanotion/Lamarck-14B-v0.4-Qwenvergence
add9a151dd5614603bebcf3d3740fa92e5d67632
36.569794
0
14.766
false
false
false
false
1.741263
0.490647
49.064704
0.653514
50.208045
0.336858
33.685801
0.378356
17.114094
0.484688
20.385937
0.540642
48.96018
false
false
2024-12-12
0
Removed
sometimesanotion_Lamarck-14B-v0.6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.6](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-details)
sometimesanotion/Lamarck-14B-v0.6
e9c144208c045fe6954ef3f658a3bda38dbd0d82
40.374393
apache-2.0
7
14.766
true
false
false
false
1.922385
0.697251
69.725107
0.646031
49.297895
0.356495
35.649547
0.389262
18.568233
0.484688
20.119271
0.539977
48.886303
true
false
2025-01-04
2025-01-05
1
sometimesanotion/Lamarck-14B-v0.6 (Merge)
sometimesanotion_Lamarck-14B-v0.6-002-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.6-002-model_stock](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6-002-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-002-model_stock-details)
sometimesanotion/Lamarck-14B-v0.6-002-model_stock
c2d5adb04b1839aeeca77a3f2a5be08864116da1
38.85335
0
7.383
false
false
false
false
1.887926
0.669224
66.922432
0.614335
45.006584
0.34139
34.138973
0.374161
16.55481
0.518021
25.452604
0.505402
45.044696
false
false
2025-01-01
0
Removed
sometimesanotion_Lamarck-14B-v0.6-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Lamarck-14B-v0.6-model_stock](https://huggingface.co/sometimesanotion/Lamarck-14B-v0.6-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Lamarck-14B-v0.6-model_stock-details)
sometimesanotion/Lamarck-14B-v0.6-model_stock
4d4227285a889ffd23618ad32ff7b08d1bcfa5ae
39.580916
0
7.383
false
false
false
false
1.861639
0.678966
67.896625
0.626944
46.491326
0.358761
35.876133
0.384228
17.897092
0.500656
22.682031
0.519781
46.642287
false
false
2024-12-31
0
Removed
sometimesanotion_Qwen-14B-ProseStock-v4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen-14B-ProseStock-v4](https://huggingface.co/sometimesanotion/Qwen-14B-ProseStock-v4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen-14B-ProseStock-v4-details)
sometimesanotion/Qwen-14B-ProseStock-v4
7bbd108559500c0efca1f8925180bb1771425559
37.225402
0
7.383
false
false
false
false
1.844482
0.494219
49.421867
0.649827
49.5413
0.354985
35.498489
0.388423
18.456376
0.493833
21.695833
0.538647
48.738549
false
false
2024-12-24
0
Removed
sometimesanotion_Qwen-2.5-14B-Virmarckeoso_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen-2.5-14B-Virmarckeoso](https://huggingface.co/sometimesanotion/Qwen-2.5-14B-Virmarckeoso) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen-2.5-14B-Virmarckeoso-details)
sometimesanotion/Qwen-2.5-14B-Virmarckeoso
36.245933
0
14.766
false
false
false
false
2.395155
0.48133
48.132954
0.656973
50.652295
0.333082
33.308157
0.379195
17.225951
0.479354
19.519271
0.537733
48.636968
false
false
2024-12-10
0
Removed
sometimesanotion_Qwen2.5-14B-Vimarckoso_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso
0865365f6c0b221c08fdd5adf8965f3720645226
35.905884
0
14.766
false
false
false
false
1.567659
0.457424
45.742408
0.644635
49.178956
0.329305
32.930514
0.392617
19.01566
0.485865
20.466406
0.532912
48.101359
false
false
2024-12-11
0
Removed
sometimesanotion_Qwen2.5-14B-Vimarckoso-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso-v2](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v2-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso-v2
5768a4448e4e3a95a7f459ac2b106abbf8510840
36.059942
0
7.383
false
false
false
false
1.582541
0.45053
45.053015
0.655034
50.419625
0.350453
35.045317
0.38255
17.673378
0.481896
19.503646
0.537982
48.664672
false
false
2024-12-26
0
Removed
sometimesanotion_Qwen2.5-14B-Vimarckoso-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso-v3](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3
e2f4b596010057af0cd8f27ba992bf9d6af48801
40.095001
apache-2.0
10
14.766
true
false
false
false
1.928625
0.725652
72.565238
0.64146
48.581587
0.344411
34.441088
0.380034
17.337808
0.480688
19.385937
0.534325
48.258348
true
false
2024-12-27
2024-12-27
1
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3 (Merge)
sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-IF-Variant_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-IF-Variant-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-IF-Variant
246b592926a9351b195650b5bcfe1cba9218a698
33.720033
0
7.383
false
false
false
false
1.967376
0.641297
64.129731
0.552079
35.653097
0.212991
21.299094
0.347315
12.975391
0.531917
28.389583
0.45886
39.873301
false
false
2024-12-28
0
Removed
sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-Prose01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-Prose01-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-Prose01
3c65fc0b2ffb89149b4c6e984414d3a13000fd7c
39.460942
0
7.383
false
false
false
false
1.893958
0.687234
68.723432
0.635877
47.706625
0.350453
35.045317
0.386745
18.232662
0.480719
19.55651
0.52751
47.501108
false
false
2024-12-30
0
Removed
sometimesanotion_Qwen2.5-14B-Vimarckoso-v3-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock](https://huggingface.co/sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwen2.5-14B-Vimarckoso-v3-model_stock-details)
sometimesanotion/Qwen2.5-14B-Vimarckoso-v3-model_stock
06ec138247d03a9308c886c8b326f210c18117e4
39.814975
0
7.383
false
false
false
false
1.884952
0.716185
71.618528
0.642092
48.761006
0.339879
33.987915
0.380034
17.337808
0.478115
19.23099
0.531582
47.953605
false
false
2024-12-27
0
Removed
sometimesanotion_Qwentinuum-14B-v013_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v013](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v013) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v013-details)
sometimesanotion/Qwentinuum-14B-v013
2e5ad6d32e76852a803b976078ac0ac2ff0aaaac
37.956304
0
7.383
false
false
false
false
1.923689
0.671123
67.112262
0.608663
43.965235
0.33006
33.006042
0.357383
14.317673
0.515417
24.99375
0.499086
44.342863
false
false
2024-12-26
0
Removed
sometimesanotion_Qwentinuum-14B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v1](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v1-details)
sometimesanotion/Qwentinuum-14B-v1
cd71c7c9f4e18deed1fe8000ae4784b96c33281f
36.547079
0
7.383
false
false
false
false
1.91815
0.503162
50.316161
0.657257
50.737494
0.324018
32.401813
0.38255
17.673378
0.478052
19.15651
0.540974
48.997119
false
false
2024-12-21
0
Removed
sometimesanotion_Qwentinuum-14B-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v2](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v2-details)
sometimesanotion/Qwentinuum-14B-v2
70f5b77f646b5f4cc6f7decf7bd3c7b3bd4cebcf
37.147883
0
7.383
false
false
false
false
2.003297
0.537833
53.783295
0.655536
50.53548
0.329305
32.930514
0.388423
18.456376
0.471417
18.19375
0.540891
48.987884
false
false
2024-12-21
0
Removed
sometimesanotion_Qwentinuum-14B-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v3](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v3-details)
sometimesanotion/Qwentinuum-14B-v3
2331e2c1afe4e224c9c019f4f03c2ad19bd15465
38.743896
0
7.383
false
false
false
false
1.913008
0.615768
61.576838
0.653865
50.037611
0.32855
32.854985
0.387584
18.344519
0.48599
20.615365
0.541307
49.034057
false
false
2024-12-22
0
Removed
sometimesanotion_Qwentinuum-14B-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v5](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v5-details)
sometimesanotion/Qwentinuum-14B-v5
8be868ce00f239bf06c859c0c40fcf4c54a9205c
38.872429
0
7.383
false
false
false
false
1.883788
0.628558
62.855778
0.654985
50.283974
0.31571
31.570997
0.387584
18.344519
0.487385
21.089844
0.541805
49.089465
false
false
2024-12-22
0
Removed
sometimesanotion_Qwentinuum-14B-v6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v6](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v6) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v6-details)
sometimesanotion/Qwentinuum-14B-v6
951576b4056fe63d02cdc31a653585d9706beba9
39.234413
0
7.383
false
false
false
false
1.903968
0.630406
63.040621
0.654452
50.23191
0.338369
33.836858
0.386745
18.232662
0.489958
21.178125
0.539977
48.886303
false
false
2024-12-22
0
Removed
sometimesanotion_Qwentinuum-14B-v6-Prose_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v6-Prose](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v6-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v6-Prose-details)
sometimesanotion/Qwentinuum-14B-v6-Prose
fc6086a7732bc8e87505f4c2bc49561a52ad04a9
38.45728
0
7.383
false
false
false
false
1.963535
0.564286
56.428609
0.654511
50.140601
0.35574
35.574018
0.388423
18.456376
0.49126
21.340885
0.539229
48.803191
false
false
2024-12-26
0
Removed
sometimesanotion_Qwentinuum-14B-v7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v7](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v7) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v7-details)
sometimesanotion/Qwentinuum-14B-v7
e9505b4931323752ebb0c901494c050835f0e4d8
38.760125
0
7.383
false
false
false
false
1.897562
0.610922
61.092235
0.655143
50.347065
0.333837
33.383686
0.39094
18.791946
0.48199
19.948698
0.540974
48.997119
false
false
2024-12-22
0
Removed
sometimesanotion_Qwentinuum-14B-v8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v8](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v8) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v8-details)
sometimesanotion/Qwentinuum-14B-v8
a856d3095937fd39f829824d6c6d9950cf56dc1d
37.654114
0
7.383
false
false
false
false
1.977966
0.541155
54.115525
0.653426
50.11143
0.34139
34.138973
0.383389
17.785235
0.487323
20.748698
0.541223
49.024823
false
false
2024-12-22
0
Removed
sometimesanotion_Qwentinuum-14B-v9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwentinuum-14B-v9](https://huggingface.co/sometimesanotion/Qwentinuum-14B-v9) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwentinuum-14B-v9-details)
sometimesanotion/Qwentinuum-14B-v9
3109d6342d8740336dc83569def5b3d80abfac38
36.802034
0
7.383
false
false
false
false
1.902691
0.51073
51.073042
0.658026
50.801347
0.323263
32.326284
0.385906
18.120805
0.478115
19.364323
0.542138
49.126404
false
false
2024-12-22
0
Removed
sometimesanotion_Qwenvergence-14B-qv256_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-qv256](https://huggingface.co/sometimesanotion/Qwenvergence-14B-qv256) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-qv256-details)
sometimesanotion/Qwenvergence-14B-qv256
13e8b600da0b78b23481738858b7ed2d533ee6e5
39.352512
0
7.383
false
false
false
false
1.963642
0.700623
70.062324
0.631208
47.078218
0.343656
34.365559
0.378356
17.114094
0.492594
21.074219
0.517786
46.420656
false
false
2025-01-01
0
Removed
sometimesanotion_Qwenvergence-14B-v0.6-004-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v0.6-004-model_stock-details)
sometimesanotion/Qwenvergence-14B-v0.6-004-model_stock
1fa94759545d9b591bcbbe93a2c90f2a346f9580
39.00507
0
7.383
false
false
false
false
1.909751
0.685985
68.598541
0.624934
46.366654
0.313444
31.344411
0.383389
17.785235
0.503323
23.348698
0.519282
46.586879
false
false
2025-01-01
0
Removed
sometimesanotion_Qwenvergence-14B-v2-Prose_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v2-Prose](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v2-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v2-Prose-details)
sometimesanotion/Qwenvergence-14B-v2-Prose
503b367e07a8ed3ce532d03ea35d40d8f17d6e35
36.741066
0
14
false
false
false
false
1.684789
0.470488
47.04883
0.651883
49.933472
0.3429
34.29003
0.393456
19.127517
0.492594
21.474219
0.537151
48.572326
false
false
2024-12-15
0
Removed
sometimesanotion_Qwenvergence-14B-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v3](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-details)
sometimesanotion/Qwenvergence-14B-v3
40c489fd71724f2fa3f7154e4874c6d00700c6c0
37.164833
0
7.383
false
false
false
false
1.902815
0.504411
50.441052
0.654824
50.352688
0.348187
34.818731
0.384228
17.897092
0.488594
20.740885
0.538647
48.738549
false
false
2024-12-21
0
Removed
sometimesanotion_Qwenvergence-14B-v3-Prose_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v3-Prose](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Prose-details)
sometimesanotion/Qwenvergence-14B-v3-Prose
15e4222295ef31aee17c2e5b6e7a31ffd21e3c7b
37.370809
0
14.766
false
false
false
false
1.711341
0.491771
49.177072
0.651291
49.798367
0.35574
35.574018
0.395134
19.35123
0.493896
21.770313
0.536985
48.553856
false
false
2024-12-21
0
Removed
sometimesanotion_Qwenvergence-14B-v3-Reason_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v3-Reason](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Reason) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Reason-details)
sometimesanotion/Qwenvergence-14B-v3-Reason
1e613b0e6bfdb08e7c21a3e6ba3b84e361cf8350
37.0468
0
7.383
false
false
false
false
1.895604
0.536684
53.668378
0.656128
50.694448
0.324018
32.401813
0.386745
18.232662
0.474021
18.452604
0.539478
48.830895
false
false
2024-12-21
0
Removed
sometimesanotion_Qwenvergence-14B-v3-Reason_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v3-Reason](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v3-Reason) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v3-Reason-details)
sometimesanotion/Qwenvergence-14B-v3-Reason
6acf3cbc9c36b19d66ac683f073e32a9bf86d56e
36.714048
0
7.383
false
false
false
false
1.926172
0.527816
52.781619
0.655744
50.635776
0.311934
31.193353
0.384228
17.897092
0.475417
18.927083
0.539644
48.849365
false
false
2024-12-21
0
Removed
sometimesanotion_Qwenvergence-14B-v6-Prose_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/Qwenvergence-14B-v6-Prose](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v6-Prose) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__Qwenvergence-14B-v6-Prose-details)
sometimesanotion/Qwenvergence-14B-v6-Prose
bbb6b0900b630a3120d036d3434ca0fa508ed559
38.824966
0
7.383
false
false
false
false
1.935264
0.599007
59.90073
0.654375
50.119976
0.348943
34.89426
0.388423
18.456376
0.488656
21.015365
0.537068
48.563091
false
false
2024-12-26
0
Removed
sometimesanotion_lamarck-14b-prose-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/lamarck-14b-prose-model_stock](https://huggingface.co/sometimesanotion/lamarck-14b-prose-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__lamarck-14b-prose-model_stock-details)
sometimesanotion/lamarck-14b-prose-model_stock
d71942f5b5471fca97914ea26a9f66bb5866693e
35.589858
0
14.766
false
false
false
false
1.557519
0.427649
42.764864
0.648762
49.383876
0.336103
33.610272
0.393456
19.127517
0.484594
20.274219
0.535406
48.378398
false
false
2024-12-09
0
Removed
sometimesanotion_lamarck-14b-reason-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
[sometimesanotion/lamarck-14b-reason-model_stock](https://huggingface.co/sometimesanotion/lamarck-14b-reason-model_stock) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sometimesanotion__lamarck-14b-reason-model_stock-details)
sometimesanotion/lamarck-14b-reason-model_stock
0f1d7f04b9219ffe3bc26aa3146380fba249d61a
36.256328
0
14.766
false
false
false
false
7.948399
0.496467
49.646715
0.65689
50.715404
0.31571
31.570997
0.384228
17.897092
0.474083
18.79375
0.540226
48.914007
false
false
2024-12-09
0
Removed
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
[sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415](https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415-details)
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-161415
467eff1ac1c3395c130929bbe1f34a8194715e7c
8.826874
apache-2.0
0
7.723
true
false
false
true
1.627712
0.289338
28.933785
0.380418
12.789212
0.007553
0.755287
0.246644
0
0.386063
6.024479
0.140126
4.458481
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
[sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205](https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205-details)
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-164205
467eff1ac1c3395c130929bbe1f34a8194715e7c
12.818811
apache-2.0
0
7.723
true
false
false
true
1.588998
0.319938
31.993777
0.395862
16.710725
0.001511
0.151057
0.276007
3.467562
0.427177
12.097135
0.212434
12.492612
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
[sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522](https://huggingface.co/sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522-details)
sonthenguyen/ft-unsloth-zephyr-sft-bnb-4bit-20241014-170522
467eff1ac1c3395c130929bbe1f34a8194715e7c
13.437097
apache-2.0
0
7.723
true
false
false
true
1.614698
0.376441
37.644118
0.382837
14.138282
0.009819
0.981873
0.265101
2.013423
0.440417
14.11875
0.205535
11.726138
false
false
2024-10-15
2024-10-16
1
unsloth/zephyr-sft-bnb-4bit
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbc-213steps_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps](https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbc-213steps-details)
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps
4ae2af48b6ac53f14e153b91309624100ae3d7c2
15.790852
apache-2.0
0
7.242
true
false
false
true
0.69881
0.427549
42.75489
0.419729
19.669907
0.021903
2.190332
0.261745
1.565996
0.408635
9.579427
0.270861
18.98456
false
false
2024-10-02
2024-10-03
0
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbc-213steps
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbo-180steps_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps](https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbo-180steps-details)
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps
0393baf362e29cf51867596fb64746b5edafa6ed
15.552012
apache-2.0
0
7.242
true
false
false
true
0.675685
0.408714
40.871443
0.432259
21.351403
0.020393
2.039275
0.276007
3.467562
0.38851
6.163802
0.274767
19.418587
false
false
2024-10-02
2024-10-03
0
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbo-180steps
sonthenguyen_zephyr-sft-bnb-4bit-DPO-mtbr-180steps_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps](https://huggingface.co/sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sonthenguyen__zephyr-sft-bnb-4bit-DPO-mtbr-180steps-details)
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps
c4ee848caf14649f9260166653d4cdb30bcfc52a
16.475407
apache-2.0
0
7.242
true
false
false
true
0.684225
0.403219
40.321901
0.430536
21.213568
0.024924
2.492447
0.280201
4.026846
0.42575
11.785417
0.27111
19.012264
false
false
2024-10-02
2024-10-03
0
sonthenguyen/zephyr-sft-bnb-4bit-DPO-mtbr-180steps
sophosympatheia_Midnight-Miqu-70B-v1.5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
[sophosympatheia/Midnight-Miqu-70B-v1.5](https://huggingface.co/sophosympatheia/Midnight-Miqu-70B-v1.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sophosympatheia__Midnight-Miqu-70B-v1.5-details)
sophosympatheia/Midnight-Miqu-70B-v1.5
f6062ca8ccba38ce91eef16f85138e279160b9b9
25.22232
other
170
68.977
true
false
false
true
6.452967
0.611847
61.184657
0.560623
38.541462
0.024169
2.416918
0.296141
6.152125
0.424417
11.652083
0.38248
31.386673
true
false
2024-03-11
2024-10-22
1
sophosympatheia/Midnight-Miqu-70B-v1.5 (Merge)
speakleash_Bielik-11B-v2_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
[speakleash/Bielik-11B-v2](https://huggingface.co/speakleash/Bielik-11B-v2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2-details)
speakleash/Bielik-11B-v2
a620588280793e605d1e0b125fe2a663030206ab
15.91354
apache-2.0
37
11.169
true
false
false
false
0.918733
0.238105
23.81049
0.493084
27.817907
0.074018
7.401813
0.288591
5.145414
0.392448
7.55599
0.313747
23.749631
false
false
2024-08-26
2024-10-16
0
speakleash/Bielik-11B-v2
speakleash_Bielik-11B-v2.0-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[speakleash/Bielik-11B-v2.0-Instruct](https://huggingface.co/speakleash/Bielik-11B-v2.0-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.0-Instruct-details)
speakleash/Bielik-11B-v2.0-Instruct
e4721e2af1152bad2e077c34375911a28aa1b8dc
24.421993
apache-2.0
4
11.169
true
false
false
true
0.888425
0.525243
52.524302
0.536158
33.774676
0.10423
10.422961
0.317114
8.948546
0.446708
14.738542
0.335106
26.122931
false
false
2024-08-26
2024-10-16
1
speakleash/Bielik-11B-v2
speakleash_Bielik-11B-v2.1-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[speakleash/Bielik-11B-v2.1-Instruct](https://huggingface.co/speakleash/Bielik-11B-v2.1-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.1-Instruct-details)
speakleash/Bielik-11B-v2.1-Instruct
c91776047eb235f51238a9e42f80f19e3ed114e7
22.854264
apache-2.0
3
11.169
true
false
false
true
1.305623
0.508982
50.898172
0.553012
36.290053
0.006042
0.60423
0.337248
11.63311
0.418521
10.515104
0.344664
27.184914
false
false
2024-08-26
2024-10-16
1
speakleash/Bielik-11B-v2
speakleash_Bielik-11B-v2.2-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[speakleash/Bielik-11B-v2.2-Instruct](https://huggingface.co/speakleash/Bielik-11B-v2.2-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.2-Instruct-details)
speakleash/Bielik-11B-v2.2-Instruct
b5502dab61fcc5e087e72c8a120057dea78082ad
24.769308
apache-2.0
57
11.169
true
false
false
true
1.460925
0.555194
55.519355
0.559656
36.958041
0.075529
7.55287
0.331376
10.850112
0.417125
10.107292
0.348654
27.628177
false
false
2024-08-26
2024-10-16
1
speakleash/Bielik-11B-v2
speakleash_Bielik-11B-v2.3-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
[speakleash/Bielik-11B-v2.3-Instruct](https://huggingface.co/speakleash/Bielik-11B-v2.3-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/speakleash__Bielik-11B-v2.3-Instruct-details)
speakleash/Bielik-11B-v2.3-Instruct
7494fdc4d648707ea12b908d40b0ae708989b329
26.191144
apache-2.0
33
11.169
true
false
false
true
0.906142
0.558291
55.829089
0.56627
38.062788
0.08006
8.006042
0.340604
12.080537
0.451823
16.011198
0.344415
27.15721
true
false
2024-08-30
2024-10-16
1
speakleash/Bielik-11B-v2.3-Instruct (Merge)
spmurrayzzz_Mistral-Syndicate-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spmurrayzzz/Mistral-Syndicate-7B](https://huggingface.co/spmurrayzzz/Mistral-Syndicate-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spmurrayzzz__Mistral-Syndicate-7B-details)
spmurrayzzz/Mistral-Syndicate-7B
c74379dd6055ef4a70339b105ea315cebec23d24
13.899525
apache-2.0
0
7.242
true
false
false
false
0.579859
0.249596
24.959552
0.424506
20.506252
0.02719
2.719033
0.276007
3.467562
0.438552
13.61901
0.263132
18.125739
false
false
2023-12-30
2024-06-27
1
mistralai/Mistral-7B-v0.1
spow12_ChatWaifu_12B_v2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spow12/ChatWaifu_12B_v2.0](https://huggingface.co/spow12/ChatWaifu_12B_v2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_12B_v2.0-details)
spow12/ChatWaifu_12B_v2.0
1fb38700b2e2a66d4ff32636817df76285cea5f1
21.778576
cc-by-nc-4.0
18
12.248
true
false
false
true
3.426215
0.476758
47.675833
0.520768
31.16524
0.058912
5.891239
0.276846
3.579418
0.443177
15.830469
0.338763
26.529255
true
false
2024-10-10
2024-10-14
1
spow12/ChatWaifu_12B_v2.0 (Merge)
spow12_ChatWaifu_22B_v2.0_preview_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spow12/ChatWaifu_22B_v2.0_preview](https://huggingface.co/spow12/ChatWaifu_22B_v2.0_preview) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_22B_v2.0_preview-details)
spow12/ChatWaifu_22B_v2.0_preview
36af7ec06bc85405e8641986ad45c6d21353b114
29.4075
cc-by-nc-4.0
6
22.247
true
false
false
true
1.494204
0.674495
67.449478
0.617015
45.488294
0.180514
18.05136
0.315436
8.724832
0.368542
3.534375
0.39877
33.196661
true
false
2024-09-23
2024-09-24
1
spow12/ChatWaifu_22B_v2.0_preview (Merge)
spow12_ChatWaifu_v1.4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spow12/ChatWaifu_v1.4](https://huggingface.co/spow12/ChatWaifu_v1.4) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v1.4-details)
spow12/ChatWaifu_v1.4
c5b2b30a8e9fa23722b6e30aa2ca1dab7fe1c2b5
25.379443
cc-by-nc-4.0
16
12.248
true
false
false
true
1.442143
0.569057
56.905677
0.517625
31.630554
0.086103
8.610272
0.307047
7.606264
0.474333
20.025
0.34749
27.498892
true
false
2024-09-03
2024-09-05
1
spow12/ChatWaifu_v1.4 (Merge)
spow12_ChatWaifu_v2.0_22B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spow12/ChatWaifu_v2.0_22B](https://huggingface.co/spow12/ChatWaifu_v2.0_22B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details)
spow12/ChatWaifu_v2.0_22B
54771319920ed791ba3f0262b036f37a92b880f2
28.838098
cc-by-nc-4.0
8
22.247
true
false
false
true
2.739835
0.651089
65.108911
0.59263
42.286228
0.185801
18.58006
0.324664
9.955257
0.384198
5.591406
0.383561
31.506723
true
false
2024-10-11
2024-10-11
1
spow12/ChatWaifu_v2.0_22B (Merge)
spow12_ChatWaifu_v2.0_22B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
[spow12/ChatWaifu_v2.0_22B](https://huggingface.co/spow12/ChatWaifu_v2.0_22B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/spow12__ChatWaifu_v2.0_22B-details)
spow12/ChatWaifu_v2.0_22B
a6e7c206d9af77d3f85faf0ce4a711d62815b2ab
28.868659
cc-by-nc-4.0
8
22.247
true
false
false
true
1.39586
0.651738
65.17385
0.590805
42.019798
0.193353
19.335347
0.323826
9.8434
0.384198
5.591406
0.381233
31.248153
true
false
2024-10-11
2024-10-14
1
spow12/ChatWaifu_v2.0_22B (Merge)
ssmits_Qwen2.5-95B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
[ssmits/Qwen2.5-95B-Instruct](https://huggingface.co/ssmits/Qwen2.5-95B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/ssmits__Qwen2.5-95B-Instruct-details)
ssmits/Qwen2.5-95B-Instruct
9c0e7df57a4fcf4d364efd916a0fc0abdd2d20a3
37.440125
other
3
94.648
true
false
false
true
19.233495
0.843105
84.310518
0.70378
58.530351
0.061178
6.117825
0.364094
15.212528
0.428385
13.614844
0.521692
46.854684
false
false
2024-09-24
2024-09-26
1
ssmits/Qwen2.5-95B-Instruct (Merge)
stabilityai_StableBeluga2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
[stabilityai/StableBeluga2](https://huggingface.co/stabilityai/StableBeluga2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__StableBeluga2-details)
stabilityai/StableBeluga2
cb47d3db70ea3ddc2cabdeb358c303b328f65900
22.682842
884
68.977
true
false
false
false
6.254674
0.378714
37.871403
0.582413
41.263261
0.036254
3.625378
0.316275
8.836689
0.472969
18.654427
0.332613
25.845892
false
true
2023-07-20
2024-06-13
0
stabilityai/StableBeluga2
stabilityai_stablelm-2-12b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
StableLmForCausalLM
[stabilityai/stablelm-2-12b](https://huggingface.co/stabilityai/stablelm-2-12b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-details)
stabilityai/stablelm-2-12b
fead13ddbf4492970666650c3cd6f85f485411ec
13.935722
other
118
12.143
true
false
false
false
1.473279
0.156921
15.692141
0.450865
22.685797
0.039275
3.927492
0.278523
3.803132
0.447885
14.485677
0.307181
23.020095
false
true
2024-03-21
2024-06-12
0
stabilityai/stablelm-2-12b
stabilityai_stablelm-2-12b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
[stabilityai/stablelm-2-12b-chat](https://huggingface.co/stabilityai/stablelm-2-12b-chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-12b-chat-details)
stabilityai/stablelm-2-12b-chat
b6b62cd451b84e848514c00fafa66d9ead9297c5
16.249477
other
88
12.143
true
false
false
true
1.088097
0.408165
40.816478
0.467202
25.253697
0.021903
2.190332
0.266779
2.237136
0.391427
7.728385
0.273438
19.270833
false
true
2024-04-04
2024-06-12
0
stabilityai/stablelm-2-12b-chat
stabilityai_stablelm-2-1_6b_float16
float16
🟢 pretrained
🟢
Original
StableLmForCausalLM
[stabilityai/stablelm-2-1_6b](https://huggingface.co/stabilityai/stablelm-2-1_6b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-details)
stabilityai/stablelm-2-1_6b
8879812cccd176fbbe9ceb747b815bcc7d6499f8
5.216127
other
187
1.645
true
false
false
false
0.549872
0.115705
11.570522
0.338458
8.632695
0.001511
0.151057
0.248322
0
0.388198
5.791406
0.14636
5.151079
false
true
2024-01-18
2024-06-12
0
stabilityai/stablelm-2-1_6b
stabilityai_stablelm-2-1_6b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
[stabilityai/stablelm-2-1_6b-chat](https://huggingface.co/stabilityai/stablelm-2-1_6b-chat) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-1_6b-chat-details)
stabilityai/stablelm-2-1_6b-chat
f3fe67057c2789ae1bb1fe42b038da99840d4f13
8.640775
other
32
1.645
true
false
false
true
0.495427
0.305999
30.599919
0.339017
7.493378
0.011329
1.132931
0.247483
0
0.357969
5.71276
0.162151
6.905659
false
true
2024-04-08
2024-06-12
0
stabilityai/stablelm-2-1_6b-chat
stabilityai_stablelm-2-zephyr-1_6b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
[stabilityai/stablelm-2-zephyr-1_6b](https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-2-zephyr-1_6b-details)
stabilityai/stablelm-2-zephyr-1_6b
2f275b1127d59fc31e4f7c7426d528768ada9ea4
9.281934
other
182
1.645
true
false
false
true
0.473089
0.327931
32.7931
0.335161
6.70871
0.022659
2.265861
0.243289
0
0.351146
5.993229
0.171376
7.930703
false
true
2024-01-19
2024-06-12
0
stabilityai/stablelm-2-zephyr-1_6b
stabilityai_stablelm-3b-4e1t_bfloat16
bfloat16
🟢 pretrained
🟢
Original
StableLmForCausalLM
[stabilityai/stablelm-3b-4e1t](https://huggingface.co/stabilityai/stablelm-3b-4e1t) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-3b-4e1t-details)
stabilityai/stablelm-3b-4e1t
fa4a6a92fca83c3b4223a3c9bf792887090ebfba
7.263251
cc-by-sa-4.0
309
2.795
true
false
false
false
0.434265
0.22032
22.031986
0.350421
9.01307
0.006798
0.679758
0.237416
0
0.377781
4.422656
0.166888
7.432033
false
true
2023-09-29
2024-08-10
0
stabilityai/stablelm-3b-4e1t
stabilityai_stablelm-zephyr-3b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
StableLmForCausalLM
[stabilityai/stablelm-zephyr-3b](https://huggingface.co/stabilityai/stablelm-zephyr-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/stabilityai__stablelm-zephyr-3b-details)
stabilityai/stablelm-zephyr-3b
a14f62d95754d96aea2be6e24c0f6966636797b9
12.356619
other
249
2.795
true
false
false
true
0.384024
0.368323
36.832272
0.386636
14.759119
0.042296
4.229607
0.239094
0
0.418302
9.78776
0.176779
8.530954
false
true
2023-11-21
2024-06-12
0
stabilityai/stablelm-zephyr-3b
sthenno-com_miscii-14b-1028_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
[sthenno-com/miscii-14b-1028](https://huggingface.co/sthenno-com/miscii-14b-1028) [📑](https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-1028-details)
sthenno-com/miscii-14b-1028
a60c866621ee35d04e84cf366e972f2466d617b1
35.054416
apache-2.0
17
14.77
true
false
false
true
1.533728
0.823671
82.367119
0.644833
49.262668
0.063444
6.344411
0.356544
14.205817
0.418156
12.002865
0.515293
46.143617
false
false
2024-11-12
2024-11-17
1
sthenno-com/miscii-14b-1028 (Merge)
sthenno-com_miscii-14b-1225_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sthenno-com/miscii-14b-1225" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sthenno-com/miscii-14b-1225</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sthenno-com__miscii-14b-1225-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sthenno-com/miscii-14b-1225
3d26f676424307cc2496c6b11710bbfa35275685
40.083651
apache-2.0
18
14.766
true
false
false
true
1.448497
0.787801
78.780081
0.657171
50.912806
0.31571
31.570997
0.377517
17.002237
0.436573
14.771615
0.527178
47.46417
true
false
2024-12-24
2024-12-24
1
sthenno-com/miscii-14b-1225 (Merge)
suayptalha_HomerCreativeAnvita-Mix-Qw7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/suayptalha/HomerCreativeAnvita-Mix-Qw7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/HomerCreativeAnvita-Mix-Qw7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__HomerCreativeAnvita-Mix-Qw7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
suayptalha/HomerCreativeAnvita-Mix-Qw7B
5be9b48b59652687d3e5b88f9e935b51869756ad
34.620978
apache-2.0
9
7.616
true
false
false
true
0.649881
0.780782
78.078166
0.556465
36.984168
0.310423
31.042296
0.314597
8.612975
0.441594
14.732552
0.444481
38.275709
true
false
2024-11-22
2024-11-24
1
suayptalha/HomerCreativeAnvita-Mix-Qw7B (Merge)
suayptalha_Komodo-Llama-3.2-3B-v2-fp16_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/suayptalha/Komodo-Llama-3.2-3B-v2-fp16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Komodo-Llama-3.2-3B-v2-fp16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Komodo-Llama-3.2-3B-v2-fp16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
suayptalha/Komodo-Llama-3.2-3B-v2-fp16
1ff4b55d952597429c249ca71dc08b823eba17c0
19.587262
apache-2.0
5
3
true
false
false
true
0.598065
0.634053
63.40532
0.4355
20.204329
0.062689
6.268882
0.277685
3.691275
0.340573
3.371615
0.285239
20.582151
false
false
2024-11-19
2024-11-19
1
suayptalha/Komodo-Llama-3.2-3B-v2-fp16 (Merge)
suayptalha_Rombos-2.5-T.E-8.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/suayptalha/Rombos-2.5-T.E-8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">suayptalha/Rombos-2.5-T.E-8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/suayptalha__Rombos-2.5-T.E-8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
suayptalha/Rombos-2.5-T.E-8.1
c0ee2950b07377e1d0e01fc013a0f200b0306ea2
27.335179
cc-by-nc-sa-4.0
6
7.616
true
false
false
true
0.686016
0.692505
69.250478
0.551464
36.499861
0.008308
0.830816
0.311242
8.165548
0.416635
10.979427
0.444564
38.284944
true
false
2024-11-16
2024-11-16
1
suayptalha/Rombos-2.5-T.E-8.1 (Merge)
sumink_Qwenftmodel_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sumink/Qwenftmodel" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwenftmodel</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwenftmodel-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sumink/Qwenftmodel
7fe96b05b36aaa1be229c436b4fe3b476be9e2dd
9.903504
other
0
1.544
true
false
false
false
1.014414
0.172909
17.290899
0.38227
14.041352
0.077039
7.703927
0.256711
0.894855
0.361719
4.614844
0.233876
14.875148
false
false
2024-12-05
2024-12-05
0
sumink/Qwenftmodel
sumink_Qwenmplus_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sumink/Qwenmplus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwenmplus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwenmplus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sumink/Qwenmplus
2f6d29692e18a32bc179e81d09d4ecdefefb85d8
9.479028
other
0
1.543
true
false
false
false
1.095854
0.204033
20.403308
0.367551
12.706589
0.030211
3.021148
0.285235
4.697987
0.382833
5.020833
0.199219
11.024306
false
false
2025-01-03
2025-01-03
0
sumink/Qwenmplus
sumink_Qwensci_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/sumink/Qwensci" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/Qwensci</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__Qwensci-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sumink/Qwensci
5cfce5a410358536c582e79a8484600ae384991a
5.436658
other
0
1.543
true
false
false
false
1.04273
0.173983
17.398281
0.328187
6.319843
0.01284
1.283988
0.258389
1.118568
0.360885
3.610677
0.125997
2.888593
false
false
2025-01-03
2025-01-03
0
sumink/Qwensci
sumink_ftgpt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/sumink/ftgpt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sumink/ftgpt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sumink__ftgpt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sumink/ftgpt
fea7c59fff2443a73a7fd11a78b1d80eb5f0c4e6
3.951784
mit
0
0.124
true
false
false
false
0.052818
0.07871
7.871004
0.291909
1.931277
0
0
0.264262
1.901566
0.413844
10.097135
0.117188
1.909722
false
false
2024-11-06
2024-11-20
0
sumink/ftgpt
sunbaby_BrainCog-8B-0.1-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/sunbaby/BrainCog-8B-0.1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">sunbaby/BrainCog-8B-0.1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/sunbaby__BrainCog-8B-0.1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
sunbaby/BrainCog-8B-0.1-Instruct
6c03cb7af723c7f7785df9eee5d5838247619bee
18.040754
apache-2.0
0
8.03
true
false
false
true
0.834554
0.4253
42.530043
0.461822
24.283468
0.076284
7.628399
0.301174
6.823266
0.365594
6.332552
0.285821
20.646794
false
false
2024-07-31
2024-08-27
1
meta-llama/Meta-Llama-3-8B
swap-uniba_LLaMAntino-3-ANITA-8B-Inst-DPO-ITA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/swap-uniba__LLaMAntino-3-ANITA-8B-Inst-DPO-ITA-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
swap-uniba/LLaMAntino-3-ANITA-8B-Inst-DPO-ITA
2b6e46e4c9d341dc8bf8350a167492c880116b66
21.752024
llama3
24
8.03
true
false
false
false
0.816639
0.481505
48.150463
0.49357
27.990828
0.043807
4.380665
0.298658
6.487696
0.43874
13.242448
0.37234
30.260047
false
false
2024-04-29
2024-10-25
1
meta-llama/Meta-Llama-3-8B-Instruct
synergetic_FrankenQwen2.5-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/synergetic/FrankenQwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">synergetic/FrankenQwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/synergetic__FrankenQwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
synergetic/FrankenQwen2.5-14B
24e41619569b50aa44698e0afabbbee30af998bd
18.126546

0
16.972
false
false
false
true
2.258982
0.186947
18.69473
0.604775
44.273555
0
0
0.270134
2.684564
0.38426
5.532552
0.438165
37.573877
false
false
2024-11-30
2024-11-30
1
synergetic/FrankenQwen2.5-14B (Merge)
talha2001_Beast-Soul-new_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/talha2001/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">talha2001/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/talha2001__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
talha2001/Beast-Soul-new
e6cf8caa60264a3005df2ff4b9d967f684519d4b
21.804866

0
7.242
false
false
false
false
0.642883
0.485351
48.535109
0.522714
33.072759
0.074773
7.477341
0.281879
4.250559
0.445927
14.140885
0.310173
23.352541
false
false
2024-08-07
2024-08-07
1
talha2001/Beast-Soul-new (Merge)
tangledgroup_tangled-llama-pints-1.5b-v0.1-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.1-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tangledgroup/tangled-llama-pints-1.5b-v0.1-instruct
3e1429f20007740877c51e44ed63b870a57a2e17
4.190264
apache-2.0
0
1.5
true
false
false
true
0.295434
0.150902
15.090183
0.314344
3.842195
0.001511
0.151057
0.239933
0
0.376135
4.85026
0.110871
1.20789
false
false
2024-08-27
2024-08-29
1
pints-ai/1.5-Pints-16K-v0.1
tangledgroup_tangled-llama-pints-1.5b-v0.2-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tangledgroup__tangled-llama-pints-1.5b-v0.2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tangledgroup/tangled-llama-pints-1.5b-v0.2-instruct
5c229e26f3ab3d0f0f613ed242f3f0f57c930155
4.65774
apache-2.0
0
1.5
true
false
false
true
0.297811
0.172409
17.240921
0.315835
4.080205
0.007553
0.755287
0.241611
0
0.364292
4.569792
0.111702
1.300236
false
false
2024-09-14
2024-09-15
1
pints-ai/1.5-Pints-16K-v0.1
tanliboy_lambda-gemma-2-9b-dpo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tanliboy/lambda-gemma-2-9b-dpo
b141471308bc41ffe15180a6668c735396c3949b
21.33689
gemma
1
9.242
true
false
false
true
2.241587
0.45008
45.008023
0.547172
35.554545
0
0
0.313758
8.501119
0.401656
7.940365
0.379156
31.017287
false
false
2024-07-24
2024-09-18
2
google/gemma-2-9b
tanliboy_lambda-gemma-2-9b-dpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-gemma-2-9b-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-gemma-2-9b-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-gemma-2-9b-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tanliboy/lambda-gemma-2-9b-dpo
b141471308bc41ffe15180a6668c735396c3949b
16.970109
gemma
1
9.242
true
false
false
true
2.903576
0.182925
18.292464
0.548791
35.739663
0
0
0.310403
8.053691
0.405625
8.569792
0.380485
31.165041
false
false
2024-07-24
2024-09-18
2
google/gemma-2-9b
tanliboy_lambda-qwen2.5-14b-dpo-test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-14b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-14b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-14b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tanliboy/lambda-qwen2.5-14b-dpo-test
96607eea3c67f14f73e576580610dba7530c5dd9
33.516192
apache-2.0
7
14.77
true
false
false
true
1.800743
0.823122
82.312154
0.639351
48.45444
0
0
0.362416
14.988814
0.426031
12.58724
0.484791
42.754507
false
false
2024-09-20
2024-09-20
2
Qwen/Qwen2.5-14B
tanliboy_lambda-qwen2.5-32b-dpo-test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/tanliboy/lambda-qwen2.5-32b-dpo-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tanliboy/lambda-qwen2.5-32b-dpo-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tanliboy__lambda-qwen2.5-32b-dpo-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tanliboy/lambda-qwen2.5-32b-dpo-test
675b60d6e859455a6139e6e284bbe1844b8ddf46
35.753394
apache-2.0
4
32.764
true
false
false
true
5.499303
0.808384
80.838398
0.67639
54.407961
0
0
0.356544
14.205817
0.427427
13.328385
0.565658
51.739805
false
false
2024-09-22
2024-09-30
2
Qwen/Qwen2.5-32B
tannedbum_Ellaria-9B_float16
float16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/tannedbum/Ellaria-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/Ellaria-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__Ellaria-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tannedbum/Ellaria-9B
087b263326da56de637912814bc7073b83b8d59a
30.028824

15
10.159
false
false
false
true
1.857113
0.78258
78.258022
0.59421
41.721561
0.026435
2.643505
0.333054
11.073826
0.415146
10.859896
0.420545
35.616135
false
false
2024-08-04
2025-01-07
1
tannedbum/Ellaria-9B (Merge)
tannedbum_L3-Nymeria-Maid-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Nymeria-Maid-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Nymeria-Maid-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Nymeria-Maid-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tannedbum/L3-Nymeria-Maid-8B
17cf2c77399d63638254353ac86adf5692b79c62
26.043176
cc-by-nc-4.0
10
8.03
true
false
false
true
0.455798
0.725003
72.500299
0.514606
31.240945
0.093656
9.365559
0.296141
6.152125
0.375052
6.48151
0.374668
30.518617
true
false
2024-06-21
2025-01-07
1
tannedbum/L3-Nymeria-Maid-8B (Merge)
tannedbum_L3-Nymeria-v2-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Nymeria-v2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Nymeria-v2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Nymeria-v2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tannedbum/L3-Nymeria-v2-8B
6f0f2526cc89c9d749b850c3e1c3484db92e5c3b
25.747182
cc-by-nc-4.0
15
8.03
true
false
false
true
0.492488
0.716835
71.683467
0.52242
32.262544
0.094411
9.441088
0.290268
5.369128
0.369875
5.134375
0.375332
30.592494
true
false
2024-06-29
2025-01-07
1
tannedbum/L3-Nymeria-v2-8B (Merge)
tannedbum_L3-Rhaenys-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tannedbum/L3-Rhaenys-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tannedbum/L3-Rhaenys-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tannedbum__L3-Rhaenys-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tannedbum/L3-Rhaenys-8B
a159e2aabf9d6ef31444dc46c3dce9fdadca77d9
26.505176
cc-by-nc-4.0
5
8.03
true
false
false
true
0.547883
0.736269
73.626866
0.529921
33.137944
0.090634
9.063444
0.297819
6.375839
0.372479
5.726563
0.379904
31.100399
true
false
2024-07-31
2025-01-07
1
tannedbum/L3-Rhaenys-8B (Merge)
teknium_CollectiveCognition-v1.1-Mistral-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/CollectiveCognition-v1.1-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__CollectiveCognition-v1.1-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
teknium/CollectiveCognition-v1.1-Mistral-7B
5f57f70ec99450c70da2540e94dd7fd67be4b23c
14.268985
apache-2.0
78
7
true
false
false
false
0.429318
0.279046
27.904626
0.449343
23.476134
0.031722
3.172205
0.286913
4.9217
0.386927
5.732552
0.28366
20.406693
false
true
2023-10-04
2024-06-12
1
mistralai/Mistral-7B-v0.1
teknium_OpenHermes-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/teknium/OpenHermes-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-13B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
teknium/OpenHermes-13B
bcad6fff9f8591e091d2d57356a3f102197e8c5f
12.169676
mit
54
13
true
false
false
false
31.119117
0.266807
26.680652
0.420644
18.213328
0.011329
1.132931
0.272651
3.020134
0.40426
8.532552
0.238946
15.43846
false
true
2023-09-06
2024-06-12
1
NousResearch/Llama-2-13b-hf
teknium_OpenHermes-2-Mistral-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
teknium/OpenHermes-2-Mistral-7B
4c6e34123b140ce773a8433cae5410949289102c
21.4153
apache-2.0
255
7
true
false
false
true
0.47503
0.528615
52.861519
0.494752
29.251839
0.043807
4.380665
0.283557
4.474273
0.451979
16.064062
0.293135
21.459441
false
true
2023-10-12
2024-06-12
1
mistralai/Mistral-7B-v0.1
teknium_OpenHermes-2.5-Mistral-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/teknium/OpenHermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-2.5-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-2.5-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
teknium/OpenHermes-2.5-Mistral-7B
24c0bea14d53e6f67f1fbe2eca5bfe7cae389b33
21.266837
apache-2.0
821
7.242
true
false
false
true
0.472783
0.557142
55.714172
0.487001
27.770026
0.047583
4.758308
0.283557
4.474273
0.424198
12.058073
0.305436
22.826167
false
true
2023-10-29
2024-06-12
1
mistralai/Mistral-7B-v0.1
teknium_OpenHermes-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/teknium/OpenHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">teknium/OpenHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/teknium__OpenHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
teknium/OpenHermes-7B
9f55d6eb15f1edd52ee1fd863a220aa682e78a00
9.481132
mit
13
7
true
false
false
false
2.48309
0.181251
18.12513
0.362034
12.081395
0.010574
1.057402
0.269295
2.572707
0.432385
12.68151
0.193318
10.368647
false
true
2023-09-14
2024-06-12
1
NousResearch/Llama-2-7b-hf
tensoropera_Fox-1-1.6B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tensoropera/Fox-1-1.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tensoropera/Fox-1-1.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tensoropera__Fox-1-1.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tensoropera/Fox-1-1.6B
6389dde4d7e52aa1200ad954c565f03c7fdcf8db
7.739189
apache-2.0
31
1.665
true
false
false
false
1.34282
0.276598
27.659831
0.330737
7.399761
0.015861
1.586103
0.263423
1.789709
0.35499
3.873698
0.137134
4.126034
false
false
2024-06-13
2024-06-29
0
tensoropera/Fox-1-1.6B
tenyx_Llama3-TenyxChat-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tenyx/Llama3-TenyxChat-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tenyx/Llama3-TenyxChat-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tenyx__Llama3-TenyxChat-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tenyx/Llama3-TenyxChat-70B
a85d31e3af8fcc847cc9169f1144cf02f5351fab
36.872248
llama3
64
70.554
true
false
false
true
9.367007
0.808709
80.870867
0.651149
49.61562
0.246224
24.622356
0.301174
6.823266
0.426031
12.520573
0.521027
46.780807
false
false
2024-04-26
2024-08-04
0
tenyx/Llama3-TenyxChat-70B
theprint_Boptruth-Agatha-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/Boptruth-Agatha-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Boptruth-Agatha-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Boptruth-Agatha-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/Boptruth-Agatha-7B
ef7c7570be29a58f4a8358a6d4c75f59a5282191
17.44944

0
7.242
false
false
false
false
0.388105
0.312419
31.241883
0.498394
29.286422
0.05136
5.135952
0.299497
6.599553
0.427667
11.758333
0.28607
20.674498
false
false
2024-09-11
2024-09-30
0
theprint/Boptruth-Agatha-7B
theprint_CleverBoi-7B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-7B-v2
1d82629c1e6778cf8568b532a3c09b668805b15a
15.032974
apache-2.0
0
7.736
true
false
false
false
1.522398
0.216998
21.699757
0.453173
23.444181
0.022659
2.265861
0.288591
5.145414
0.469531
18.658073
0.270861
18.98456
false
false
2024-09-12
2024-09-13
2
mistralai/Mistral-7B-v0.3
theprint_CleverBoi-7B-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-7B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-7B-v3
1d82629c1e6778cf8568b532a3c09b668805b15a
13.589762
apache-2.0
0
7.736
true
false
false
false
1.60289
0.23823
23.823012
0.441443
21.936747
0.033988
3.398792
0.26594
2.12528
0.407177
9.497135
0.286818
20.757609
false
false
2024-09-14
2024-09-22
2
mistralai/Mistral-7B-v0.3
theprint_CleverBoi-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-Llama-3.1-8B-Instruct
3514c510ea4ba4d650522f467d4d0cef7de4a43c
13.605339
apache-2.0
1
16.061
true
false
false
false
1.870223
0.168163
16.81627
0.455962
24.048603
0.02719
2.719033
0.300336
6.711409
0.401438
8.279688
0.307513
23.057033
false
false
2024-08-27
2024-09-13
3
meta-llama/Meta-Llama-3.1-8B
theprint_CleverBoi-Llama-3.1-8B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-Llama-3.1-8B-v2
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
14.095235
apache-2.0
0
9.3
true
false
false
false
2.521379
0.19614
19.613958
0.466782
24.132845
0.049849
4.984894
0.286074
4.809843
0.373469
6.716927
0.318816
24.312943
false
false
2024-09-15
2024-09-22
2
meta-llama/Meta-Llama-3.1-8B
theprint_CleverBoi-Nemo-12B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/CleverBoi-Nemo-12B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/CleverBoi-Nemo-12B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__CleverBoi-Nemo-12B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/CleverBoi-Nemo-12B-v2
cd1f9ee1c484f857bb0e5ae6aac37dc434911f10
17.68216
apache-2.0
4
13.933
true
false
false
false
3.505513
0.204583
20.458273
0.524109
31.652695
0.0929
9.29003
0.313758
8.501119
0.418677
11.434635
0.322806
24.756206
false
false
2024-09-16
2024-09-24
1
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit