Column schema (name, dtype, observed length/value range or number of distinct values):

| Column | Dtype | Range / values |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 6 values |
| T | stringclasses | 6 values |
| Weight type | stringclasses | 3 values |
| Architecture | stringclasses | 59 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 1.03–52 |
| Hub License | stringclasses | 25 values |
| Hub ❤️ | int64 | 0–5.96k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03–107 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.27–0.75 |
| BBH | float64 | 0.81–63.5 |
| MATH Lvl 5 Raw | float64 | 0–0.51 |
| MATH Lvl 5 | float64 | 0–50.7 |
| GPQA Raw | float64 | 0.22–0.44 |
| GPQA | float64 | 0–24.9 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.5 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 457 values |
| Submission Date | stringclasses | 200 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
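Records following this schema can be worked with programmatically. A minimal sketch in plain Python, using two records transcribed from the rows below (in practice the full dataset would be loaded from the Hugging Face Hub rather than hard-coded; only a handful of the 36 columns are shown):

```python
# Sample records copied from the leaderboard rows below; the column
# subset and key names used here are illustrative, not the full schema.
rows = [
    {"fullname": "tiiuae/Falcon3-10B-Base", "Average": 27.592675,
     "Params_B": 10.306, "Type": "🟢 pretrained"},
    {"fullname": "tiiuae/Falcon3-10B-Instruct", "Average": 35.185885,
     "Params_B": 10.306, "Type": "💬 chat models (RLHF, DPO, IFT, ...)"},
]

# Rank by the leaderboard's headline "Average ⬆️" score, highest first.
ranked = sorted(rows, key=lambda r: r["Average"], reverse=True)
for r in ranked:
    print(f'{r["fullname"]}: {r["Average"]:.2f}')
```

This mirrors how the leaderboard itself orders entries: the per-benchmark normalized scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO) are averaged into the single "Average ⬆️" column used for ranking.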
theprint_Code-Llama-Bagel-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/Code-Llama-Bagel-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Code-Llama-Bagel-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Code-Llama-Bagel-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/Code-Llama-Bagel-8B
7fa415f3f758ab7930d7e1df27b2d16207513125
14.526782
llama3
1
8.03
true
false
false
false
0.818047
0.252968
25.296768
0.469742
25.338155
0.05287
5.287009
0.276007
3.467562
0.367979
7.530729
0.282164
20.24047
true
false
2024-06-21
2024-09-13
1
theprint/Code-Llama-Bagel-8B (Merge)
theprint_Conversely-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/Conversely-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Conversely-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Conversely-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/Conversely-Mistral-7B
d8cadc02ac76bd617a919d50b092e59d2d110aff
14.717953
apache-2.0
0
14.496
true
false
false
false
1.03698
0.260811
26.081131
0.467235
25.706966
0.009063
0.906344
0.285235
4.697987
0.418896
10.628646
0.28258
20.286643
false
false
2024-12-05
2024-12-07
2
mistralai/Mistral-7B-v0.3
theprint_Llama-3.2-3B-VanRossum_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/Llama-3.2-3B-VanRossum" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/Llama-3.2-3B-VanRossum</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__Llama-3.2-3B-VanRossum-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/Llama-3.2-3B-VanRossum
7048abecd492a1f5d53981cb175431ec01bbced0
17.521868
apache-2.0
0
3.696
true
false
false
false
1.854588
0.478282
47.828207
0.427874
19.366362
0.093656
9.365559
0.267617
2.348993
0.344167
6.554167
0.277011
19.667923
false
false
2024-11-14
2024-11-14
2
meta-llama/Llama-3.2-3B-Instruct
theprint_ReWiz-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-7B
d9f28e67d52181d1478e7788e3edf252f5bf32a8
17.598219
apache-2.0
0
7.736
true
false
false
false
1.445406
0.404793
40.479262
0.456422
23.50443
0.029456
2.945619
0.275168
3.355705
0.461156
16.744531
0.267038
18.559767
false
false
2024-10-08
2024-10-08
3
mistralai/Mistral-7B-v0.3
theprint_ReWiz-Llama-3.1-8B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.1-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.1-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.1-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Llama-3.1-8B-v2
a8b0fc584b10e0110e04f9d21c7f10d24391c1d5
15.681926
apache-2.0
1
9.3
true
false
false
false
2.327395
0.237306
23.73059
0.463243
23.773287
0.045317
4.531722
0.302852
7.04698
0.381375
9.338542
0.331034
25.670434
false
false
2024-11-02
2024-11-03
2
meta-llama/Meta-Llama-3.1-8B
theprint_ReWiz-Llama-3.2-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Llama-3.2-3B
e6aed95ad8f104f105b8423cd5f87c75705a828c
17.984844
apache-2.0
2
3.213
true
false
false
false
1.30972
0.464893
46.489315
0.434326
19.293728
0.097432
9.743202
0.283557
4.474273
0.361375
6.938542
0.28873
20.970006
false
false
2024-10-18
2024-10-28
1
theprint/ReWiz-Llama-3.2-3B (Merge)
theprint_ReWiz-Nemo-12B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Nemo-12B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Nemo-12B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Nemo-12B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Nemo-12B-Instruct
6f8ea24f8d19b48850d68bef1b5c50837d37761b
15.631853
apache-2.0
2
12.248
true
false
false
false
1.17003
0.106238
10.623811
0.509241
29.926389
0.071752
7.175227
0.323826
9.8434
0.409563
10.228646
0.333943
25.993647
false
false
2024-10-31
2024-11-02
1
unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
theprint_ReWiz-Qwen-2.5-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Qwen-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Qwen-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Qwen-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Qwen-2.5-14B
e5524628f15c30d7542427c53a565e6e2d3ff760
29.641502
apache-2.0
5
16.743
true
false
false
false
5.928266
0.278546
27.854648
0.617949
44.861873
0.268882
26.888218
0.380034
17.337808
0.453896
15.436979
0.509225
45.469489
false
false
2024-11-05
2024-11-10
2
Qwen/Qwen2.5-14B
theprint_ReWiz-Worldbuilder-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/ReWiz-Worldbuilder-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/ReWiz-Worldbuilder-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__ReWiz-Worldbuilder-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/ReWiz-Worldbuilder-7B
e88c715097d824f115f59a97e612d662ffb1031f
15.664819
0
7.248
false
false
false
false
0.610867
0.25102
25.101952
0.463616
25.076347
0.029456
2.945619
0.269295
2.572707
0.45725
16.389583
0.297124
21.902704
false
false
2024-10-28
2024-10-28
1
theprint/ReWiz-Worldbuilder-7B (Merge)
theprint_RuDolph-Hermes-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/theprint/RuDolph-Hermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/RuDolph-Hermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__RuDolph-Hermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/RuDolph-Hermes-7B
e07aea56963bbfe5c6753d1056566a56acc30d4a
19.024425
0
7.242
false
false
false
false
0.502067
0.360429
36.042922
0.505293
30.709648
0.050604
5.060423
0.312081
8.277405
0.422615
11.026823
0.307264
23.029329
false
false
2024-11-10
2024-11-10
1
theprint/RuDolph-Hermes-7B (Merge)
theprint_WorldBuilder-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/WorldBuilder-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/WorldBuilder-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__WorldBuilder-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/WorldBuilder-12B
20cfd0e98fb2628b00867147b2c6f423d27f3561
14.377937
apache-2.0
0
13.933
true
false
false
false
2.831275
0.137438
13.743755
0.50101
29.277996
0.036254
3.625378
0.29698
6.263982
0.406646
8.997396
0.319232
24.359116
false
false
2024-10-27
2024-11-18
1
unsloth/mistral-nemo-base-2407-bnb-4bit
theprint_phi-3-mini-4k-python_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/theprint/phi-3-mini-4k-python" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">theprint/phi-3-mini-4k-python</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/theprint__phi-3-mini-4k-python-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
theprint/phi-3-mini-4k-python
81453e5718775630581ab9950e6c0ccf0d7a4177
17.564493
apache-2.0
0
4.132
true
false
false
false
1.375551
0.240878
24.087754
0.493759
28.446016
0.095166
9.516616
0.291107
5.480984
0.392167
9.220833
0.357713
28.634752
false
false
2024-06-03
2024-09-13
1
unsloth/Phi-3-mini-4k-instruct-bnb-4bit
thomas-yanxin_XinYuan-Qwen2-1_5B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-1_5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-1_5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-1_5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-1_5B
a01b362887832bea08d686737861ac3d5b437a32
11.515091
other
1
1.777
true
false
false
true
1.352364
0.298556
29.855561
0.363549
12.12558
0.067221
6.722054
0.270134
2.684564
0.363396
2.624479
0.235705
15.07831
false
false
2024-08-25
2024-09-04
1
Removed
thomas-yanxin_XinYuan-Qwen2-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-7B
c62d83eee2f4812ac17fc17d307f4aa1a77c5359
22.217714
other
1
7.616
true
false
false
true
3.276154
0.44376
44.376033
0.493663
28.401489
0.132931
13.293051
0.291107
5.480984
0.405812
9.259896
0.392453
32.494829
false
false
2024-08-21
2024-09-03
0
thomas-yanxin/XinYuan-Qwen2-7B
thomas-yanxin_XinYuan-Qwen2-7B-0917_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2-7B-0917
6cee1b155fca9ae1f558f434953dfdadb9596af0
22.721617
other
4
7.616
true
false
false
true
1.485564
0.37192
37.191984
0.516922
32.619938
0.088369
8.836858
0.309564
7.941834
0.440104
13.679688
0.424535
36.059397
false
false
2024-09-17
2024-09-17
0
thomas-yanxin/XinYuan-Qwen2-7B-0917
thomas-yanxin_XinYuan-Qwen2.5-7B-0917_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/thomas-yanxin/XinYuan-Qwen2.5-7B-0917" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">thomas-yanxin/XinYuan-Qwen2.5-7B-0917</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/thomas-yanxin__XinYuan-Qwen2.5-7B-0917-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
bbbeafd1003c4d5e13f09b7223671957384b961a
18.175037
other
4
7.616
true
false
false
true
0.971225
0.357706
35.770644
0.518411
33.439669
0
0
0.28104
4.138702
0.367552
3.677344
0.388215
32.023862
false
false
2024-09-17
2024-09-24
0
thomas-yanxin/XinYuan-Qwen2.5-7B-0917
tiiuae_Falcon3-10B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-10B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-10B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-10B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-10B-Base
0b20cceec08ec598ed2de7a6dfbeb208f1eae656
27.592675
other
33
10.306
true
false
false
false
0.810389
0.364775
36.477546
0.595004
41.375462
0.247734
24.773414
0.345638
12.751678
0.439792
14.173958
0.424036
36.003989
false
true
2024-12-03
2024-12-12
0
tiiuae/Falcon3-10B-Base
tiiuae_Falcon3-10B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-10B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-10B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-10B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-10B-Instruct
9be8471432d7c4f35f72505fa2ca4101f0a2ed6d
35.185885
other
81
10.306
true
false
false
true
0.840411
0.781656
78.165601
0.617047
44.82154
0.259063
25.906344
0.328859
10.514541
0.432323
13.607031
0.442902
38.100251
false
true
2024-12-14
2024-12-16
1
tiiuae/Falcon3-10B-Base
tiiuae_Falcon3-1B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-1B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-1B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-1B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-1B-Base
cc56a5a7c3923821312ad14f52c5a7c3fa835cbc
9.837744
other
13
1.669
true
false
false
false
0.401369
0.242801
24.280132
0.357115
11.343173
0.030211
3.021148
0.279362
3.914989
0.41474
9.709115
0.160821
6.757905
false
true
2024-12-13
2024-12-16
0
tiiuae/Falcon3-1B-Base
tiiuae_Falcon3-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-1B-Instruct
27dd70ccb22fd3cc71c5adbc95eb670455afff3d
15.321193
other
28
1.669
true
false
false
true
0.39702
0.555668
55.566785
0.374454
12.961374
0.01284
1.283988
0.266779
2.237136
0.418896
10.561979
0.183843
9.315898
false
true
2024-12-14
2024-12-16
1
tiiuae/Falcon3-1B-Base
tiiuae_Falcon3-3B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-3B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-3B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-3B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-3B-Base
3d49753006a0fa5384031a737c60fbcd0f60b7f2
15.751331
other
15
3.228
true
false
false
false
0.481216
0.276499
27.649858
0.442137
21.584784
0.11858
11.858006
0.29698
6.263982
0.37499
6.273698
0.287899
20.87766
false
true
2024-12-13
2024-12-13
0
tiiuae/Falcon3-3B-Base
tiiuae_Falcon3-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-3B-Instruct
552213004cecf9bb6ce332f46da0d4324c8347f1
26.551992
other
22
3.228
true
false
false
true
0.480464
0.697676
69.76755
0.475443
26.287229
0.246979
24.697885
0.288591
5.145414
0.413594
11.132552
0.300532
22.281324
false
true
2024-12-14
2024-12-16
0
tiiuae/Falcon3-3B-Instruct
tiiuae_Falcon3-7B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-7B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-7B-Base
a1cf49eb7a53210fc2ee82f3876bbc7efb2244fd
24.720549
other
21
7.456
true
false
false
false
0.609372
0.341595
34.159475
0.509888
31.559919
0.192598
19.259819
0.346477
12.863535
0.470208
18.142708
0.391041
32.33784
false
true
2024-11-21
2024-12-12
0
tiiuae/Falcon3-7B-Base
tiiuae_Falcon3-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-7B-Instruct
7aae4f3953f3dbfaa81aeecbb404a6bbba0e0c06
34.906699
other
43
7.456
true
false
false
true
0.618761
0.761248
76.124793
0.563244
37.915812
0.318731
31.873112
0.310403
8.053691
0.482677
21.167969
0.408743
34.304817
false
true
2024-11-29
2024-12-16
1
tiiuae/Falcon3-7B-Base
tiiuae_Falcon3-Mamba-7B-Base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconMambaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-Mamba-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-Mamba-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-Mamba-7B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-Mamba-7B-Base
f08d14145ce86c32dd04f18bacb3f12b247042e2
18.126204
other
17
7.273
true
false
false
false
0.836318
0.289113
28.911289
0.469928
25.534049
0.193353
19.335347
0.309564
7.941834
0.343146
4.393229
0.303773
22.641475
false
true
2024-12-11
2024-12-12
0
tiiuae/Falcon3-Mamba-7B-Base
tiiuae_Falcon3-Mamba-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconMambaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/Falcon3-Mamba-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/Falcon3-Mamba-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__Falcon3-Mamba-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/Falcon3-Mamba-7B-Instruct
382561849d1509b5f1a4d7a38bb286b3c4f46fbd
27.643894
other
19
7.273
true
false
false
true
0.828498
0.71651
71.650997
0.467896
25.203505
0.272659
27.265861
0.303691
7.158837
0.386865
8.258073
0.336935
26.326093
false
true
2024-12-13
2024-12-13
1
tiiuae/Falcon3-Mamba-7B-Instruct (Merge)
tiiuae_falcon-11B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-11B
066e3bf4e2d9aaeefa129af0a6d39727d27816b3
13.814138
unknown
212
11.103
true
false
false
false
1.082871
0.326132
32.613244
0.439164
21.937999
0.02568
2.567976
0.270973
2.796421
0.398646
7.530729
0.238946
15.43846
false
true
2024-05-09
2024-06-09
0
tiiuae/falcon-11B
tiiuae_falcon-40b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-40b
4a70170c215b36a3cce4b4253f6d0612bb7d4146
11.36354
apache-2.0
2,424
40
true
false
false
false
21.793584
0.249645
24.964539
0.401853
16.583305
0.015861
1.586103
0.27349
3.131991
0.363146
5.193229
0.250499
16.722074
false
true
2023-05-24
2024-06-09
0
tiiuae/falcon-40b
tiiuae_falcon-40b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-40b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-40b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-40b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-40b-instruct
ecb78d97ac356d098e79f0db222c9ce7c5d9ee5f
10.434154
apache-2.0
1,174
40
true
false
false
false
19.733245
0.245449
24.544874
0.405387
17.220114
0.016616
1.661631
0.25
0
0.376229
5.161979
0.226147
14.016327
false
true
2023-05-25
2024-06-09
0
tiiuae/falcon-40b-instruct
tiiuae_falcon-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-7b
898df1396f35e447d5fe44e0a3ccaaaa69f30d36
5.110504
apache-2.0
1,083
7
true
false
false
false
0.785841
0.182051
18.20514
0.328524
5.963937
0.006042
0.60423
0.244966
0
0.377844
4.497135
0.112533
1.392583
false
true
2023-04-24
2024-06-09
0
tiiuae/falcon-7b
tiiuae_falcon-7b-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
FalconForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-7b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-7b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-7b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-7b-instruct
cf4b3c42ce2fdfe24f753f0f0d179202fea59c99
5.015869
apache-2.0
935
7
true
false
false
false
0.766215
0.196889
19.68887
0.320342
4.823178
0.006042
0.60423
0.247483
0
0.363365
3.253906
0.115525
1.72503
false
true
2023-04-25
2024-06-09
0
tiiuae/falcon-7b-instruct
tiiuae_falcon-mamba-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
FalconMambaForCausalLM
<a target="_blank" href="https://huggingface.co/tiiuae/falcon-mamba-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tiiuae/falcon-mamba-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tiiuae__falcon-mamba-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tiiuae/falcon-mamba-7b
5337fd73f19847e111ba2291f3f0e1617b90c37d
15.116297
other
225
7
true
false
false
false
3.610408
0.333576
33.357602
0.428485
19.876878
0.040785
4.07855
0.310403
8.053691
0.421031
10.86224
0.230219
14.468824
false
true
2024-07-17
2024-07-23
0
tiiuae/falcon-mamba-7b
tklohj_WindyFloLLM_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tklohj/WindyFloLLM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tklohj/WindyFloLLM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tklohj__WindyFloLLM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tklohj/WindyFloLLM
21f4241ab3f091d1d309e9076a8d8e3f014908a8
14.205891
0
13.016
false
false
false
false
1.098512
0.266856
26.685639
0.463662
24.398763
0.013595
1.359517
0.275168
3.355705
0.425313
11.864063
0.258145
17.571661
false
false
2024-06-30
2024-07-10
1
tklohj/WindyFloLLM (Merge)
togethercomputer_GPT-JT-6B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-JT-6B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-JT-6B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-JT-6B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-JT-6B-v1
f34aa35f906895602c1f86f5685e598afdea8051
6.827354
apache-2.0
301
6
true
false
false
false
37.958811
0.206106
20.610646
0.330266
7.318524
0.007553
0.755287
0.260906
1.454139
0.373656
3.873698
0.162566
6.951832
false
true
2022-11-24
2024-06-12
0
togethercomputer/GPT-JT-6B-v1
togethercomputer_GPT-NeoXT-Chat-Base-20B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/GPT-NeoXT-Chat-Base-20B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/GPT-NeoXT-Chat-Base-20B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__GPT-NeoXT-Chat-Base-20B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/GPT-NeoXT-Chat-Base-20B
d386708e84d862a65f7d2b4989f64750cb657227
4.964062
apache-2.0
696
20
true
false
false
false
2.983588
0.182976
18.297562
0.332097
6.830795
0.01284
1.283988
0.25
0
0.346063
1.757812
0.114528
1.614214
false
true
2023-03-03
2024-06-12
0
togethercomputer/GPT-NeoXT-Chat-Base-20B
togethercomputer_LLaMA-2-7B-32K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/LLaMA-2-7B-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/LLaMA-2-7B-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__LLaMA-2-7B-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/LLaMA-2-7B-32K
46c24bb5aef59722fa7aa6d75e832afd1d64b980
6.737011
llama2
537
7
true
false
false
false
0.584573
0.186497
18.649738
0.339952
8.089984
0.008308
0.830816
0.25
0
0.375365
4.320573
0.176779
8.530954
false
true
2023-07-26
2024-06-12
0
togethercomputer/LLaMA-2-7B-32K
togethercomputer_Llama-2-7B-32K-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/Llama-2-7B-32K-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/Llama-2-7B-32K-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__Llama-2-7B-32K-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/Llama-2-7B-32K-Instruct
d27380af003252f5eb0d218e104938b4e673e3f3
8.20819
llama2
159
7
true
false
false
false
0.589909
0.213
21.300039
0.344347
8.56347
0.01284
1.283988
0.251678
0.223714
0.405594
9.199219
0.178108
8.678709
false
true
2023-08-08
2024-06-12
0
togethercomputer/Llama-2-7B-32K-Instruct
togethercomputer_RedPajama-INCITE-7B-Base_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Base
78f7e482443971f4873ba3239f0ac810a367833b
5.486286
apache-2.0
94
7
true
false
false
false
1.220607
0.20823
20.822972
0.319489
5.087242
0.011329
1.132931
0.255034
0.671141
0.362
3.016667
0.119681
2.186761
false
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Base
togethercomputer_RedPajama-INCITE-7B-Chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Chat
47b94a739e2f3164b438501c8684acc5d5acc146
3.962784
apache-2.0
92
7
true
false
false
false
1.219336
0.155798
15.579773
0.317545
4.502174
0.001511
0.151057
0.252517
0.33557
0.34476
1.861719
0.112118
1.34641
false
true
2023-05-04
2024-06-13
0
togethercomputer/RedPajama-INCITE-7B-Chat
togethercomputer_RedPajama-INCITE-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-7B-Instruct
7f36397b9985a3f981cdb618f8fec1c565ca5927
6.356021
apache-2.0
103
7
true
false
false
false
1.181119
0.205507
20.550694
0.337744
7.905416
0.015106
1.510574
0.250839
0.111857
0.36851
5.030469
0.127244
3.027113
false
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-7B-Instruct
togethercomputer_RedPajama-INCITE-Base-3B-v1_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Base-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Base-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Base-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Base-3B-v1
094fbdd0c911feb485ce55de1952ab2e75277e1e
5.445562
apache-2.0
90
3
true
false
false
false
0.776102
0.229363
22.936254
0.30604
3.518608
0.009819
0.981873
0.243289
0
0.373875
4.001042
0.11112
1.235594
false
true
2023-05-04
2024-06-12
0
togethercomputer/RedPajama-INCITE-Base-3B-v1
togethercomputer_RedPajama-INCITE-Chat-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Chat-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Chat-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Chat-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Chat-3B-v1
f0e0995eba801096ed04cb87931d96a8316871af
4.748119
apache-2.0
152
3
true
false
false
false
0.774909
0.165215
16.521496
0.321669
5.164728
0.003021
0.302115
0.244128
0
0.368448
5.089323
0.112699
1.411052
false
true
2023-05-05
2024-06-13
0
togethercomputer/RedPajama-INCITE-Chat-3B-v1
togethercomputer_RedPajama-INCITE-Instruct-3B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-3B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">togethercomputer/RedPajama-INCITE-Instruct-3B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/togethercomputer__RedPajama-INCITE-Instruct-3B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
0c66778ee09a036886741707733620b91057909a
5.676527
apache-2.0
93
3
true
false
false
false
0.760671
0.212426
21.242636
0.314602
4.510786
0.006798
0.679758
0.247483
0
0.388604
6.408854
0.110954
1.217125
false
true
2023-05-05
2024-06-12
0
togethercomputer/RedPajama-INCITE-Instruct-3B-v1
tokyotech-llm_Llama-3-Swallow-8B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/tokyotech-llm__Llama-3-Swallow-8B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
1fae784584dd03680b72dd4de7eefbc5b7cabcd5
22.307385
llama3
18
8.03
true
false
false
true
0.85811
0.550772
55.077195
0.500939
29.267966
0.072508
7.250755
0.28943
5.257271
0.435698
13.795573
0.30876
23.195553
false
false
2024-06-26
2024-09-12
0
tokyotech-llm/Llama-3-Swallow-8B-Instruct-v0.1
unsloth_Phi-3-mini-4k-instruct_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/Phi-3-mini-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/Phi-3-mini-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__Phi-3-mini-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/Phi-3-mini-4k-instruct
636c707430a5509c80b1aa51d05c127ed339a975
27.178374
mit
41
3.821
true
false
false
true
0.469533
0.544028
54.402762
0.550024
36.732473
0.154079
15.407855
0.322987
9.731544
0.428417
13.11875
0.403092
33.676862
false
false
2024-04-29
2024-11-25
0
unsloth/Phi-3-mini-4k-instruct
unsloth_phi-4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4
682399cd249206f583fc19473d5a28af0a9bcea7
34.484598
mit
42
14.66
true
false
false
true
0.943269
0.688208
68.82084
0.688587
55.253145
0.125378
12.537764
0.336409
11.521253
0.411427
10.128385
0.537816
48.646203
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
unsloth_phi-4-bnb-4bit_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4-bnb-4bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4-bnb-4bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-bnb-4bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4-bnb-4bit
85ca2925f3cc4f3c42de4168e9ba0695be5d5845
34.61689
mit
9
8.058
true
false
false
true
1.523872
0.672971
67.297105
0.676985
53.535199
0.194109
19.410876
0.338087
11.744966
0.400729
8.424479
0.525598
47.288712
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
unsloth_phi-4-unsloth-bnb-4bit_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/unsloth/phi-4-unsloth-bnb-4bit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">unsloth/phi-4-unsloth-bnb-4bit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/unsloth__phi-4-unsloth-bnb-4bit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
unsloth/phi-4-unsloth-bnb-4bit
227e8cbc0de0cd783703a3a2f217159a86041a5f
34.94908
mit
24
8.483
true
false
false
true
1.518768
0.679391
67.939068
0.679109
53.840081
0.200151
20.015106
0.336409
11.521253
0.403396
8.757813
0.52859
47.621158
false
false
2025-01-08
2025-01-09
1
microsoft/phi-4
upstage_SOLAR-10.7B-Instruct-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-Instruct-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-Instruct-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-Instruct-v1.0
c08c25ed66414a878fe0401a3596d536c083606c
19.628255
cc-by-nc-4.0
619
10.732
true
false
false
true
0.782776
0.473661
47.3661
0.516249
31.872402
0
0
0.308725
7.829978
0.389938
6.942188
0.31383
23.758865
false
true
2023-12-12
2024-06-12
1
upstage/SOLAR-10.7B-Instruct-v1.0 (Merge)
upstage_SOLAR-10.7B-v1.0_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/SOLAR-10.7B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/SOLAR-10.7B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__SOLAR-10.7B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/SOLAR-10.7B-v1.0
a45090b8e56bdc2b8e32e46b3cd782fc0bea1fa5
4.916448
apache-2.0
295
10.732
true
false
false
false
1.519194
0.171585
17.158473
0.299835
2.147163
0.023414
2.34139
0.260906
1.454139
0.368198
4.52474
0.116855
1.872784
false
true
2023-12-12
2024-06-12
0
upstage/SOLAR-10.7B-v1.0
upstage_solar-pro-preview-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
SolarForCausalLM
<a target="_blank" href="https://huggingface.co/upstage/solar-pro-preview-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">upstage/solar-pro-preview-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/upstage__solar-pro-preview-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
upstage/solar-pro-preview-instruct
b4db141b5fb08b23f8bc323bc34e2cff3e9675f8
39.900891
mit
439
22.14
true
false
false
true
1.741763
0.841581
84.158145
0.681684
54.822351
0.218278
21.827795
0.370805
16.107383
0.441656
15.007031
0.527344
47.482639
false
true
2024-09-09
2024-09-11
0
upstage/solar-pro-preview-instruct
uukuguy_speechless-code-mistral-7b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-code-mistral-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-code-mistral-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-code-mistral-7b-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-code-mistral-7b-v1.0
1862e0a712efc6002112e9c1235a197d58419b37
18.091887
apache-2.0
18
7
true
false
false
false
0.646398
0.366524
36.652416
0.457171
24.091412
0.046073
4.607251
0.284396
4.58613
0.450177
14.772135
0.314578
23.841977
false
false
2023-10-10
2024-06-26
0
uukuguy/speechless-code-mistral-7b-v1.0
uukuguy_speechless-codellama-34b-v2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-codellama-34b-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-codellama-34b-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-codellama-34b-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-codellama-34b-v2.0
419bc42a254102d6a5486a1a854068e912c4047c
17.209358
llama2
17
34
true
false
false
false
1.991254
0.460422
46.042168
0.481313
25.993293
0.043051
4.305136
0.269295
2.572707
0.378708
7.205208
0.254239
17.137633
false
false
2023-10-04
2024-06-26
0
uukuguy/speechless-codellama-34b-v2.0
uukuguy_speechless-coder-ds-6.7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-coder-ds-6.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-coder-ds-6.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-coder-ds-6.7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-coder-ds-6.7b
c813a5268c6dfe267a720ad3b51773f1ab0feb59
9.639323
apache-2.0
6
6.7
true
false
false
false
0.788604
0.25047
25.046986
0.403637
15.897457
0.016616
1.661631
0.264262
1.901566
0.381938
5.342188
0.171875
7.986111
false
false
2023-12-30
2024-06-26
0
uukuguy/speechless-coder-ds-6.7b
uukuguy_speechless-instruct-mistral-7b-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-instruct-mistral-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-instruct-mistral-7b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-instruct-mistral-7b-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-instruct-mistral-7b-v0.2
87a4d214f7d028d61c3dc013a7410b3c34a24072
18.018597
apache-2.0
0
7.242
true
false
false
false
0.61762
0.326132
32.613244
0.460667
24.558747
0.043807
4.380665
0.281879
4.250559
0.490177
21.172135
0.290226
21.136229
false
false
2024-05-22
2024-06-26
0
uukuguy/speechless-instruct-mistral-7b-v0.2
uukuguy_speechless-llama2-hermes-orca-platypus-wizardlm-13b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-llama2-hermes-orca-platypus-wizardlm-13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
954cc87b0ed5fa280126de546daf648861031512
18.600891
32
13.016
false
false
false
false
0.979524
0.456175
45.617517
0.484554
26.791727
0.01435
1.435045
0.270134
2.684564
0.4655
17.754167
0.255901
17.322326
false
false
2023-09-01
2024-06-26
0
uukuguy/speechless-llama2-hermes-orca-platypus-wizardlm-13b
uukuguy_speechless-mistral-dolphin-orca-platypus-samantha-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-mistral-dolphin-orca-platypus-samantha-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
b1de043468a15198b55a6509293a4ee585139043
18.340089
llama2
17
7.242
true
false
false
false
0.655719
0.370022
37.002154
0.498277
29.653129
0.029456
2.945619
0.283557
4.474273
0.436135
13.85026
0.299036
22.1151
false
false
2023-10-13
2024-06-26
0
uukuguy/speechless-mistral-dolphin-orca-platypus-samantha-7b
uukuguy_speechless-zephyr-code-functionary-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/uukuguy/speechless-zephyr-code-functionary-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">uukuguy/speechless-zephyr-code-functionary-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/uukuguy__speechless-zephyr-code-functionary-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
uukuguy/speechless-zephyr-code-functionary-7b
d66fc775ece679966e352195c42444e9c70af7fa
16.360129
apache-2.0
2
7.242
true
false
false
false
0.634
0.269579
26.957916
0.466428
25.983623
0.036254
3.625378
0.300336
6.711409
0.426771
11.613021
0.309425
23.26943
false
false
2024-01-23
2024-06-26
0
uukuguy/speechless-zephyr-code-functionary-7b
v000000_L3-8B-Stheno-v3.2-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3-8B-Stheno-v3.2-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3-8B-Stheno-v3.2-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3-8B-Stheno-v3.2-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3-8B-Stheno-v3.2-abliterated
ddb17f127a1c068b105b79aadd76632615743f68
24.645788
8
8.03
false
false
false
true
0.500355
0.671772
67.177201
0.514144
30.746305
0.070997
7.099698
0.309564
7.941834
0.361969
5.979427
0.360372
28.93026
false
false
2024-07-09
2025-01-07
1
v000000/L3-8B-Stheno-v3.2-abliterated (Merge)
v000000_L3.1-Niitorm-8B-DPO-t0.0001_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Niitorm-8B-DPO-t0.0001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Niitorm-8B-DPO-t0.0001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Niitorm-8B-DPO-t0.0001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Niitorm-8B-DPO-t0.0001
a34150b5f63de4bc83d79b1de127faff3750289f
28.100642
7
8.03
false
false
false
true
0.878109
0.768867
76.886661
0.513423
30.513173
0.161631
16.163142
0.294463
5.928412
0.387979
7.264063
0.386636
31.848404
false
false
2024-09-19
2024-09-19
1
v000000/L3.1-Niitorm-8B-DPO-t0.0001 (Merge)
v000000_L3.1-Storniitova-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/L3.1-Storniitova-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/L3.1-Storniitova-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__L3.1-Storniitova-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/L3.1-Storniitova-8B
05b126857f43d1b1383e50f8c97d214ceb199723
28.281707
7
8.03
false
false
false
true
0.81354
0.781656
78.165601
0.515145
30.810993
0.146526
14.652568
0.28943
5.257271
0.402896
9.961979
0.377576
30.841829
false
false
2024-09-12
2024-09-18
1
v000000/L3.1-Storniitova-8B (Merge)
v000000_Qwen2.5-14B-Gutenberg-1e-Delta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-1e-Delta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-1e-Delta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-1e-Delta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-1e-Delta
f624854b4380e01322e752ce4daadd49ac86580f
32.105096
apache-2.0
4
14.77
true
false
false
true
1.802387
0.804512
80.451203
0.63985
48.616672
0
0
0.328859
10.514541
0.407302
9.379427
0.493019
43.668735
false
false
2024-09-20
2024-09-28
1
v000000/Qwen2.5-14B-Gutenberg-1e-Delta (Merge)
v000000_Qwen2.5-14B-Gutenberg-Instruct-Slerpeno_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-14B-Gutenberg-Instruct-Slerpeno-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno
1069abb4c25855e67ffaefa08a0befbb376e7ca7
32.487656
apache-2.0
5
14.77
true
false
false
true
3.788155
0.819749
81.974938
0.63901
48.452124
0
0
0.331376
10.850112
0.411365
10.053906
0.492354
43.594858
true
false
2024-09-20
2024-12-07
1
v000000/Qwen2.5-14B-Gutenberg-Instruct-Slerpeno (Merge)
v000000_Qwen2.5-Lumen-14B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/v000000/Qwen2.5-Lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">v000000/Qwen2.5-Lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/v000000__Qwen2.5-Lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
v000000/Qwen2.5-Lumen-14B
fbb1d184ed01dac52d307737893ebb6b0ace444c
32.200288
apache-2.0
18
14.77
true
false
false
true
1.836693
0.80636
80.636046
0.639081
48.507861
0
0
0.32802
10.402685
0.411396
10.291146
0.490276
43.363992
false
false
2024-09-20
2024-09-20
1
v000000/Qwen2.5-Lumen-14B (Merge)
vhab10_Llama-3.1-8B-Base-Instruct-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.1-8B-Base-Instruct-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.1-8B-Base-Instruct-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.1-8B-Base-Instruct-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.1-8B-Base-Instruct-SLERP
eccb4bde0dc91f586954109ecdce7c94f47e2625
19.249617
mit
1
8.03
true
false
false
false
0.806721
0.290712
29.071198
0.505744
29.926042
0.11858
11.858006
0.296141
6.152125
0.401063
9.366146
0.362118
29.124187
true
false
2024-09-16
2024-09-29
1
vhab10/Llama-3.1-8B-Base-Instruct-SLERP (Merge)
vhab10_Llama-3.2-Instruct-3B-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/Llama-3.2-Instruct-3B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/Llama-3.2-Instruct-3B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__Llama-3.2-Instruct-3B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/Llama-3.2-Instruct-3B-TIES
0e8661730f40a6a279bd273cfe9fe46bbd0507dd
17.296562
mit
0
1.848
true
false
false
false
1.122926
0.472737
47.273678
0.433236
19.183159
0.095921
9.592145
0.269295
2.572707
0.349656
3.873698
0.291556
21.283983
true
false
2024-10-06
2024-11-23
1
vhab10/Llama-3.2-Instruct-3B-TIES (Merge)
vhab10_llama-3-8b-merged-linear_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vhab10/llama-3-8b-merged-linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vhab10/llama-3-8b-merged-linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vhab10__llama-3-8b-merged-linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vhab10/llama-3-8b-merged-linear
c37e7671b5ccfadbf3065fa5b48af05cd4f13292
23.911368
mit
0
4.65
true
false
false
true
1.304943
0.591663
59.166345
0.493709
27.816051
0.081571
8.1571
0.299497
6.599553
0.419052
11.68151
0.370429
30.047651
false
false
2024-09-26
2024-09-26
1
vhab10/llama-3-8b-merged-linear (Merge)
vicgalle_CarbonBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B
3fe9bf5327606d013b182fed17a472f5f043759b
22.470186
apache-2.0
9
10.732
true
false
false
true
0.915379
0.54153
54.152981
0.529365
33.060604
0.061934
6.193353
0.302013
6.935123
0.402031
9.18724
0.327626
25.291814
true
false
2024-01-21
2024-06-26
1
vicgalle/CarbonBeagle-11B (Merge)
vicgalle_CarbonBeagle-11B-truthy_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/CarbonBeagle-11B-truthy" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/CarbonBeagle-11B-truthy</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__CarbonBeagle-11B-truthy-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/CarbonBeagle-11B-truthy
476cd2a6d938bddb38dfbeb4cb21e3e34303413d
21.357727
apache-2.0
10
10.732
true
false
false
true
0.907273
0.521221
52.122147
0.534842
33.988376
0.05136
5.135952
0.299497
6.599553
0.373969
4.11276
0.335688
26.187574
false
false
2024-02-10
2024-07-13
0
vicgalle/CarbonBeagle-11B-truthy
vicgalle_Configurable-Hermes-2-Pro-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Hermes-2-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Hermes-2-Pro-Llama-3-8B
3cb5792509966a963645be24fdbeb2e7dc6cac15
22.351954
apache-2.0
6
8.031
true
false
false
true
0.748927
0.576251
57.625101
0.505484
30.509625
0.063444
6.344411
0.29698
6.263982
0.418365
10.06224
0.309757
23.306368
false
false
2024-05-02
2024-07-24
2
NousResearch/Meta-Llama-3-8B
vicgalle_Configurable-Llama-3.1-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Llama-3.1-8B-Instruct
133b3ab1a5385ff9b3d17da2addfe3fc1fd6f733
28.010111
apache-2.0
15
8.03
true
false
false
true
0.79661
0.83124
83.124
0.504476
29.661398
0.172961
17.296073
0.274329
3.243848
0.384542
5.934375
0.359209
28.800975
false
false
2024-07-24
2024-08-05
0
vicgalle/Configurable-Llama-3.1-8B-Instruct
vicgalle_Configurable-Yi-1.5-9B-Chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Configurable-Yi-1.5-9B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Configurable-Yi-1.5-9B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Configurable-Yi-1.5-9B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Configurable-Yi-1.5-9B-Chat
992cb2232caae78eff6a836b2e0642f7cbf6018e
23.972567
apache-2.0
2
8.829
true
false
false
true
0.941909
0.432345
43.234507
0.54522
35.334445
0.073263
7.326284
0.343121
12.416107
0.427115
12.022656
0.401513
33.501404
false
false
2024-05-12
2024-06-26
0
vicgalle/Configurable-Yi-1.5-9B-Chat
vicgalle_ConfigurableBeagle-11B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableBeagle-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableBeagle-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableBeagle-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableBeagle-11B
bbc16dbf94b8e8a99bb3e2ada6755faf9c2990dd
22.635544
apache-2.0
3
10.732
true
false
false
true
0.879857
0.583445
58.344526
0.528659
32.392023
0.043807
4.380665
0.302013
6.935123
0.395302
7.379427
0.337434
26.381501
false
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableBeagle-11B
vicgalle_ConfigurableHermes-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableHermes-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableHermes-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableHermes-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableHermes-7B
1333a88eaf6591836b2d9825d1eaec7260f336c9
19.536295
apache-2.0
3
7.242
true
false
false
true
0.617282
0.54108
54.107989
0.457297
23.158164
0.047583
4.758308
0.276846
3.579418
0.405688
9.110938
0.302527
22.502955
false
false
2024-02-17
2024-06-26
0
vicgalle/ConfigurableHermes-7B
vicgalle_ConfigurableSOLAR-10.7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/ConfigurableSOLAR-10.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/ConfigurableSOLAR-10.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__ConfigurableSOLAR-10.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/ConfigurableSOLAR-10.7B
9d9baad88ea9dbaa61881f15e4f0d16e931033b4
19.045696
apache-2.0
2
10.732
true
false
false
true
0.677681
0.509956
50.995581
0.486681
27.45095
0
0
0.298658
6.487696
0.380479
5.193229
0.31732
24.14672
false
false
2024-03-10
2024-06-26
0
vicgalle/ConfigurableSOLAR-10.7B
vicgalle_Humanish-RP-Llama-3.1-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Humanish-RP-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Humanish-RP-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Humanish-RP-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Humanish-RP-Llama-3.1-8B
d27aa731db1d390a8d17b0a4565c9231ee5ae8b9
25.347671
apache-2.0
9
8.03
true
false
false
true
0.753451
0.666926
66.692598
0.510039
29.95856
0.147281
14.728097
0.286913
4.9217
0.395208
8.267708
0.347656
27.517361
false
false
2024-08-03
2024-08-03
0
vicgalle/Humanish-RP-Llama-3.1-8B
vicgalle_Merge-Mistral-Prometheus-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mistral-Prometheus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mistral-Prometheus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mistral-Prometheus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mistral-Prometheus-7B
a7083581b508ce83c74f9267f07024bd462e7161
16.574054
apache-2.0
1
7.242
true
false
false
true
0.630356
0.484801
48.480144
0.42014
18.410406
0.017372
1.73716
0.263423
1.789709
0.41
9.95
0.271692
19.076906
true
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mistral-Prometheus-7B (Merge)
vicgalle_Merge-Mixtral-Prometheus-8x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Merge-Mixtral-Prometheus-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Merge-Mixtral-Prometheus-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Merge-Mixtral-Prometheus-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Merge-Mixtral-Prometheus-8x7B
ba53ee5b52a81e56b01e919c069a0d045cfd4e83
24.794158
apache-2.0
2
46.703
true
true
false
true
3.674009
0.574403
57.440259
0.53515
34.651421
0.094411
9.441088
0.308725
7.829978
0.40975
9.585417
0.368351
29.816785
true
false
2024-05-04
2024-06-26
1
vicgalle/Merge-Mixtral-Prometheus-8x7B (Merge)
vicgalle_Roleplay-Llama-3-8B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vicgalle/Roleplay-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vicgalle/Roleplay-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vicgalle__Roleplay-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vicgalle/Roleplay-Llama-3-8B
57297eb57dcc2c116f061d9dda341094203da01b
24.083124
apache-2.0
37
8.03
true
false
false
true
1.126159
0.732022
73.202215
0.501232
28.554604
0.095166
9.516616
0.260906
1.454139
0.352885
1.677344
0.370844
30.093824
false
false
2024-04-19
2024-06-26
0
vicgalle/Roleplay-Llama-3-8B
vihangd_smart-dan-sft-v0.1_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vihangd/smart-dan-sft-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vihangd/smart-dan-sft-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vihangd__smart-dan-sft-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vihangd/smart-dan-sft-v0.1
924b4a09153d4061fa9d58f03b10cd7cde7e3084
3.783096
apache-2.0
0
0.379
true
false
false
false
0.361025
0.157646
15.764616
0.306177
3.125599
0.004532
0.453172
0.255034
0.671141
0.350188
1.106771
0.114195
1.577275
false
false
2024-08-09
2024-08-20
0
vihangd/smart-dan-sft-v0.1
voidful_smol-360m-ft_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/voidful/smol-360m-ft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">voidful/smol-360m-ft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/voidful__smol-360m-ft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
voidful/smol-360m-ft
3889a38fc79d2400997e01bf1d00c8059d72fead
4.739578
apache-2.0
0
0.362
true
false
false
true
0.381729
0.20131
20.13103
0.301195
3.022706
0.005287
0.528701
0.245805
0
0.371365
3.78724
0.10871
0.96779
false
false
2024-11-24
2024-11-28
1
voidful/smol-360m-ft (Merge)
vonjack_MobileLLM-125M-HF_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/MobileLLM-125M-HF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/MobileLLM-125M-HF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__MobileLLM-125M-HF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/MobileLLM-125M-HF
7664f5e1b91faa04fac545f64db84c26316c7e63
5.464647
cc-by-nc-4.0
0
0.125
true
false
false
false
0.171811
0.210728
21.072754
0.30273
3.146584
0.003021
0.302115
0.260067
1.342282
0.378187
5.106771
0.116356
1.817376
false
false
2024-11-15
2024-11-15
0
vonjack/MobileLLM-125M-HF
vonjack_Phi-3-mini-4k-instruct-LLaMAfied_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/Phi-3-mini-4k-instruct-LLaMAfied" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Phi-3-mini-4k-instruct-LLaMAfied</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3-mini-4k-instruct-LLaMAfied-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/Phi-3-mini-4k-instruct-LLaMAfied
96a48b8ea6f661f71ade001a0a2232b66ac38481
26.804435
mit
11
3.821
true
false
false
true
0.451115
0.578749
57.874883
0.574068
40.201852
0.128399
12.839879
0.330537
10.738255
0.392354
7.110938
0.388547
32.060801
false
false
2024-04-24
2025-01-03
0
vonjack/Phi-3-mini-4k-instruct-LLaMAfied
vonjack_Phi-3.5-mini-instruct-hermes-fc-json_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/vonjack/Phi-3.5-mini-instruct-hermes-fc-json" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Phi-3.5-mini-instruct-hermes-fc-json</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Phi-3.5-mini-instruct-hermes-fc-json-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/Phi-3.5-mini-instruct-hermes-fc-json
4cacfb35723647d408f0414886d0dfe67404a14f
4.516525
apache-2.0
1
4.132
true
false
false
true
1.285189
0.141584
14.158433
0.297476
2.390836
0
0
0.254195
0.559284
0.404135
8.45026
0.113863
1.540337
false
false
2024-11-05
2024-11-05
1
vonjack/Phi-3.5-mini-instruct-hermes-fc-json (Merge)
vonjack_Qwen2.5-Coder-0.5B-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/Qwen2.5-Coder-0.5B-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/Qwen2.5-Coder-0.5B-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__Qwen2.5-Coder-0.5B-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/Qwen2.5-Coder-0.5B-Merged
38e4789c0fc5fad359de2f7bafdb65c3ae26b95b
6.350287
0
0.63
false
false
false
true
0.496779
0.309971
30.997088
0.307602
3.588738
0
0
0.253356
0.447427
0.330344
0.826302
0.12018
2.242169
false
false
2024-11-19
2024-11-19
1
vonjack/Qwen2.5-Coder-0.5B-Merged (Merge)
vonjack_SmolLM2-1.7B-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-1.7B-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-1.7B-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-1.7B-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/SmolLM2-1.7B-Merged
232d54a335220b0d83d6036f6d8df3971d3e79bb
11.944703
0
1.711
false
false
false
true
0.311327
0.369797
36.979658
0.358655
10.76653
0.045317
4.531722
0.279362
3.914989
0.340792
3.832292
0.204787
11.643026
false
false
2024-11-18
2024-11-18
1
vonjack/SmolLM2-1.7B-Merged (Merge)
vonjack_SmolLM2-135M-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-135M-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-135M-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-135M-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/SmolLM2-135M-Merged
a1700ca913a87ad713edfe57a2030a9d7c088970
5.73396
0
0.135
false
false
false
true
0.34551
0.248297
24.829674
0.309993
4.587041
0.003021
0.302115
0.238255
0
0.366187
3.440104
0.111203
1.244829
false
false
2024-11-15
2024-11-15
1
vonjack/SmolLM2-135M-Merged (Merge)
vonjack_SmolLM2-360M-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/vonjack/SmolLM2-360M-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">vonjack/SmolLM2-360M-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/vonjack__SmolLM2-360M-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
vonjack/SmolLM2-360M-Merged
32bceedf56b29a4a9fdd459a36fbc7fae5e274c8
7.130731
0
0.362
false
false
false
true
0.385742
0.320587
32.058715
0.315485
4.741734
0.007553
0.755287
0.255872
0.782998
0.352729
3.357813
0.109791
1.08784
false
false
2024-11-15
2024-11-15
1
vonjack/SmolLM2-360M-Merged (Merge)
w4r10ck_SOLAR-10.7B-Instruct-v1.0-uncensored_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/w4r10ck__SOLAR-10.7B-Instruct-v1.0-uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
baa7b3899e85af4b2f02b01fd93f203872140d27
20.577181
apache-2.0
30
10.732
true
false
false
false
0.801971
0.388406
38.84061
0.530153
33.858639
0.003021
0.302115
0.294463
5.928412
0.463948
18.49349
0.334358
26.03982
false
false
2023-12-14
2024-10-11
0
w4r10ck/SOLAR-10.7B-Instruct-v1.0-uncensored
wannaphong_KhanomTanLLM-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/wannaphong/KhanomTanLLM-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">wannaphong/KhanomTanLLM-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/wannaphong__KhanomTanLLM-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
wannaphong/KhanomTanLLM-Instruct
351239c92c0ff3304d1dd98fdf4ac054a8c1acc3
4.617874
apache-2.0
2
3.447
true
false
false
true
0.401731
0.162118
16.211763
0.309312
3.944866
0.001511
0.151057
0.263423
1.789709
0.370062
4.291146
0.111868
1.318706
false
false
2024-08-24
2024-08-29
0
wannaphong/KhanomTanLLM-Instruct
waqasali1707_Beast-Soul-new_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/waqasali1707/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">waqasali1707/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/waqasali1707__Beast-Soul-new-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
waqasali1707/Beast-Soul-new
a23d68c4556d91a129de3f8fd8b9e0ff0890f4cc
22.108388
0
7.242
false
false
false
false
0.636888
0.502987
50.298652
0.522495
33.044262
0.070242
7.024169
0.282718
4.362416
0.448563
14.503646
0.310755
23.417184
false
false
2024-08-07
2024-08-07
1
waqasali1707/Beast-Soul-new (Merge)
wave-on-discord_qwent-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/wave-on-discord/qwent-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">wave-on-discord/qwent-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/wave-on-discord__qwent-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
wave-on-discord/qwent-7b
40000e76d2a4d0ad054aff9fe873c5beb0e4925e
8.734093
0
7.616
false
false
false
false
1.323496
0.201485
20.148539
0.42281
18.066398
0
0
0.265101
2.013423
0.381656
5.473698
0.160322
6.702497
false
false
2024-09-30
2024-09-30
1
wave-on-discord/qwent-7b (Merge)
win10_ArliAI-RPMax-v1.3-merge-13.3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/win10/ArliAI-RPMax-v1.3-merge-13.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/ArliAI-RPMax-v1.3-merge-13.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__ArliAI-RPMax-v1.3-merge-13.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/ArliAI-RPMax-v1.3-merge-13.3B
4d3ed351827f1afc1652e13aafeb1eae79b8f562
16.456101
0
13.265
false
false
false
true
1.451305
0.303826
30.382607
0.458139
23.0298
0.034743
3.47432
0.274329
3.243848
0.43251
14.163802
0.31998
24.442228
false
false
2024-11-16
2024-11-17
1
win10/ArliAI-RPMax-v1.3-merge-13.3B (Merge)
win10_Breeze-13B-32k-Instruct-v1_0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Breeze-13B-32k-Instruct-v1_0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Breeze-13B-32k-Instruct-v1_0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Breeze-13B-32k-Instruct-v1_0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Breeze-13B-32k-Instruct-v1_0
220c957cf5d9c534a4ef75c11a18221c461de40a
15.411206
apache-2.0
0
12.726
true
false
false
true
1.448811
0.358431
35.843118
0.461123
25.258699
0.009819
0.981873
0.264262
1.901566
0.420198
11.058073
0.256815
17.423907
true
false
2024-06-26
2024-06-26
0
win10/Breeze-13B-32k-Instruct-v1_0
win10_EVA-Norns-Qwen2.5-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/win10/EVA-Norns-Qwen2.5-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/EVA-Norns-Qwen2.5-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__EVA-Norns-Qwen2.5-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/EVA-Norns-Qwen2.5-v0.1
90c3ca66e700b4a7d2c509634f9b9748a2e4c3ab
24.657872
1
7.616
false
false
false
true
0.656661
0.621963
62.196306
0.507241
30.060942
0.154834
15.483384
0.285235
4.697987
0.40451
8.563802
0.342503
26.944814
false
false
2024-11-17
2024-11-18
1
win10/EVA-Norns-Qwen2.5-v0.1 (Merge)
win10_Llama-3.2-3B-Instruct-24-9-29_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Llama-3.2-3B-Instruct-24-9-29" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Llama-3.2-3B-Instruct-24-9-29</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Llama-3.2-3B-Instruct-24-9-29-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Llama-3.2-3B-Instruct-24-9-29
4defb10e2415111abb873d695dd40c387c1d6d57
23.929169
llama3.2
0
3.213
true
false
false
true
0.713606
0.733221
73.322119
0.461423
24.196426
0.166163
16.616314
0.274329
3.243848
0.355521
1.440104
0.322806
24.756206
false
false
2024-09-29
2024-10-11
2
meta-llama/Llama-3.2-3B-Instruct
win10_Norns-Qwen2.5-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Norns-Qwen2.5-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Norns-Qwen2.5-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Norns-Qwen2.5-12B
464793295c8633a95e6faedad24dfa8f0fd35663
16.386375
1
12.277
false
false
false
true
1.622972
0.489697
48.969734
0.461892
23.769257
0.004532
0.453172
0.283557
4.474273
0.35549
2.202865
0.266041
18.448951
false
false
2024-11-17
2024-11-17
1
win10/Norns-Qwen2.5-12B (Merge)
win10_Norns-Qwen2.5-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Norns-Qwen2.5-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Norns-Qwen2.5-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Norns-Qwen2.5-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Norns-Qwen2.5-7B
148d9156f734a8050812892879cf13d1ca01f137
24.593277
0
7.616
false
false
false
true
0.649914
0.612221
61.222113
0.507289
30.250415
0.155589
15.558912
0.284396
4.58613
0.408479
9.126563
0.34134
26.815529
false
false
2024-11-17
2024-11-18
1
win10/Norns-Qwen2.5-7B (Merge)
win10_Qwen2.5-2B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/win10/Qwen2.5-2B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/Qwen2.5-2B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__Qwen2.5-2B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/Qwen2.5-2B-Instruct
6cc7fca3447d50772978d2d7dec255abdc73d54b
10.21821
1
2.9
false
false
false
false
1.026092
0.227289
22.728915
0.370591
12.071946
0.001511
0.151057
0.267617
2.348993
0.437844
13.630469
0.193401
10.377881
false
false
2024-10-11
2024-12-20
1
win10/Qwen2.5-2B-Instruct (Merge)
win10_llama3-13.45b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/win10/llama3-13.45b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">win10/llama3-13.45b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/win10__llama3-13.45b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
win10/llama3-13.45b-Instruct
94cc0f415e355c6d3d47168a6ff5239ca586904a
17.277282
llama3
1
13.265
true
false
false
true
2.136535
0.414435
41.443481
0.486542
26.67569
0.020393
2.039275
0.258389
1.118568
0.38476
6.328385
0.334525
26.058289
true
false
2024-06-09
2024-06-26
1
win10/llama3-13.45b-Instruct (Merge)