eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
winglian_Llama-3-8b-64k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/winglian/Llama-3-8b-64k-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">winglian/Llama-3-8b-64k-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/winglian__Llama-3-8b-64k-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | winglian/Llama-3-8b-64k-PoSE | 5481d9b74a3ec5a95789673e194c8ff86e2bc2bc | 11.004738 | 75 | 8.03 | false | false | false | true | 0.911021 | 0.285691 | 28.569086 | 0.370218 | 13.307317 | 0.033233 | 3.323263 | 0.260906 | 1.454139 | 0.339552 | 3.077344 | 0.246676 | 16.297281 | false | false | 2024-04-24 | 2024-06-26 | 0 | winglian/Llama-3-8b-64k-PoSE |
winglian_llama-3-8b-256k-PoSE_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/winglian/llama-3-8b-256k-PoSE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">winglian/llama-3-8b-256k-PoSE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/winglian__llama-3-8b-256k-PoSE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | winglian/llama-3-8b-256k-PoSE | 93e7b0b6433c96583ffcef3bc47203e6fdcbbe8b | 6.557715 | 42 | 8.03 | false | false | false | true | 1.050723 | 0.290911 | 29.091145 | 0.315658 | 5.502849 | 0.015106 | 1.510574 | 0.25755 | 1.006711 | 0.331552 | 0.94401 | 0.111619 | 1.291002 | false | false | 2024-04-26 | 2024-06-26 | 0 | winglian/llama-3-8b-256k-PoSE |
wzhouad_gemma-2-9b-it-WPO-HB_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/wzhouad/gemma-2-9b-it-WPO-HB" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">wzhouad/gemma-2-9b-it-WPO-HB</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/wzhouad__gemma-2-9b-it-WPO-HB-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | wzhouad/gemma-2-9b-it-WPO-HB | 5934cb2faf589341e96e2e79aec82b2d4b7be252 | 22.422181 | 32 | 9.242 | false | false | false | true | 2.589947 | 0.543703 | 54.370293 | 0.562862 | 36.661696 | 0 | 0 | 0.349832 | 13.310962 | 0.367458 | 3.965625 | 0.336021 | 26.224512 | false | false | 2024-08-08 | 2025-01-07 | 2 | google/gemma-2-9b |
xMaulana_FinMatcha-3B-Instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xMaulana/FinMatcha-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xMaulana/FinMatcha-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xMaulana__FinMatcha-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xMaulana/FinMatcha-3B-Instruct | be2c0c04fc4dc3fb93631e3c663721da92fea8fc | 24.016243 | apache-2.0 | 0 | 3.213 | true | false | false | true | 6.577035 | 0.754828 | 75.48283 | 0.453555 | 23.191023 | 0.135952 | 13.595166 | 0.269295 | 2.572707 | 0.363333 | 5.016667 | 0.318152 | 24.239066 | false | false | 2024-09-29 | 2024-10-22 | 1 | xMaulana/FinMatcha-3B-Instruct (Merge) |
xinchen9_Llama3.1_8B_Instruct_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_8B_Instruct_CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_8B_Instruct_CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_8B_Instruct_CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_8B_Instruct_CoT | cab1b33ddff08de11c5daea8ae079d126d503d8b | 16.190743 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.856552 | 0.297357 | 29.735657 | 0.439821 | 21.142866 | 0.05287 | 5.287009 | 0.302013 | 6.935123 | 0.437062 | 13.166146 | 0.287899 | 20.87766 | false | false | 2024-09-16 | 2024-09-19 | 0 | xinchen9/Llama3.1_8B_Instruct_CoT |
xinchen9_Llama3.1_CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_CoT | 3cb467f51a59ff163bb942fcde3ef60573c12b79 | 13.351283 | apache-2.0 | 0 | 8.03 | true | false | false | true | 0.950099 | 0.224616 | 22.461624 | 0.434101 | 19.899124 | 0.015106 | 1.510574 | 0.288591 | 5.145414 | 0.430458 | 11.773958 | 0.273853 | 19.317007 | false | false | 2024-09-04 | 2024-09-06 | 0 | xinchen9/Llama3.1_CoT |
xinchen9_Llama3.1_CoT_V1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Llama3.1_CoT_V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Llama3.1_CoT_V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Llama3.1_CoT_V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Llama3.1_CoT_V1 | c5ed4b8bfc364ebae1843af14799818551f5251f | 14.394947 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.873462 | 0.245299 | 24.529914 | 0.4376 | 20.166003 | 0.01284 | 1.283988 | 0.279362 | 3.914989 | 0.457219 | 16.41901 | 0.280502 | 20.055777 | false | false | 2024-09-06 | 2024-09-07 | 0 | xinchen9/Llama3.1_CoT_V1 |
xinchen9_Mistral-7B-CoT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/Mistral-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/Mistral-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__Mistral-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/Mistral-7B-CoT | 9a3c8103dac20d5497d1b8fc041bb5125ff4dc00 | 11.202955 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.888689 | 0.279871 | 27.987074 | 0.387268 | 14.806193 | 0.019637 | 1.963746 | 0.249161 | 0 | 0.399427 | 8.195052 | 0.228391 | 14.265662 | false | false | 2024-09-09 | 2024-09-23 | 0 | xinchen9/Mistral-7B-CoT |
xinchen9_llama3-b8-ft-dis_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xinchen9/llama3-b8-ft-dis" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xinchen9/llama3-b8-ft-dis</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xinchen9__llama3-b8-ft-dis-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xinchen9/llama3-b8-ft-dis | e4da730f28f79543262de37908943c35f8df81fe | 13.897963 | apache-2.0 | 0 | 8.03 | true | false | false | false | 1.062327 | 0.154599 | 15.459869 | 0.462579 | 24.727457 | 0.034743 | 3.47432 | 0.312919 | 8.389262 | 0.365375 | 6.405208 | 0.324385 | 24.931664 | false | false | 2024-06-28 | 2024-07-11 | 0 | xinchen9/llama3-b8-ft-dis |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table | c083d6796f54f66b4cec2261657a02801c761093 | 22.421029 | 0 | 8.03 | false | false | false | true | 0.624231 | 0.637475 | 63.747523 | 0.491227 | 27.422821 | 0.067976 | 6.797583 | 0.259228 | 1.230425 | 0.382 | 5.483333 | 0.3686 | 29.844489 | false | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_2b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table | 5416d34b5243559914a377ee9d95ce4830bf8dba | 24.502405 | 0 | 8.03 | false | false | false | true | 0.750264 | 0.727451 | 72.745094 | 0.505686 | 29.398353 | 0.084592 | 8.459215 | 0.260067 | 1.342282 | 0.381906 | 5.104948 | 0.369681 | 29.964539 | false | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_bt_8b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table | 235204157d7fac0d64fa609d5aee3cebb49ccd11 | 22.236354 | 0 | 8.03 | false | false | false | true | 0.671741 | 0.656859 | 65.685936 | 0.495183 | 27.6952 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.359396 | 2.291146 | 0.37018 | 30.019947 | false | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_2b-table |
xkp24_Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table | 9db00cbbba84453b18956fcc76f264f94a205955 | 22.935265 | 0 | 8.03 | false | false | false | true | 0.719228 | 0.66208 | 66.207995 | 0.500449 | 28.508587 | 0.077795 | 7.779456 | 0.259228 | 1.230425 | 0.380542 | 5.001042 | 0.359957 | 28.884087 | false | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-Iter2_gp_8b-table |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 | 1062757826de031a4ae82277e6e737e19e82e514 | 21.845481 | 0 | 8.03 | false | false | false | true | 0.615003 | 0.604228 | 60.422789 | 0.493606 | 27.613714 | 0.064955 | 6.495468 | 0.259228 | 1.230425 | 0.379333 | 5.216667 | 0.370844 | 30.093824 | false | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_2b-table-0.001 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 | e5d2f179b4a7bd851dcf2b7db6358b13001bf1af | 23.938825 | 0 | 8.03 | false | false | false | true | 0.841468 | 0.713188 | 71.318768 | 0.499638 | 28.574879 | 0.069486 | 6.94864 | 0.258389 | 1.118568 | 0.387208 | 6.067708 | 0.366439 | 29.604388 | false | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_bt_8b-table-0.002 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 | 0e319ad47ed2b2636b72d07ee9b32657e1e50412 | 21.224624 | 0 | 8.03 | false | false | false | true | 0.679841 | 0.594711 | 59.471092 | 0.489922 | 26.943904 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.358094 | 2.328385 | 0.370429 | 30.047651 | false | false | 2024-09-30 | 2024-09-30 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_2b-table-0.001 |
xkp24_Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xkp24__Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 | 0877f2458ea667edcf9213383df41294c788190f | 22.69358 | 0 | 8.03 | false | false | false | true | 0.769119 | 0.645319 | 64.531887 | 0.495108 | 28.046978 | 0.067976 | 6.797583 | 0.260067 | 1.342282 | 0.393875 | 7.334375 | 0.352975 | 28.108378 | false | false | 2024-09-30 | 2024-10-01 | 0 | xkp24/Llama-3-8B-Instruct-SPPO-score-Iter2_gp_8b-table-0.002 |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table | d2b87100e5ba3215fddbd308bb17b7bf12fe6c9e | 21.01778 | 0 | 8.03 | false | false | false | true | 0.98643 | 0.575602 | 57.560163 | 0.490121 | 26.866404 | 0.079305 | 7.930514 | 0.259228 | 1.230425 | 0.365969 | 2.979427 | 0.365858 | 29.539746 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_2b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table | 19a48ccf5ea463afbbbc61d650b8fb63ff2d94c7 | 23.969226 | 0 | 8.03 | false | false | false | true | 0.590153 | 0.703446 | 70.344575 | 0.509187 | 29.731239 | 0.086858 | 8.685801 | 0.259228 | 1.230425 | 0.373906 | 3.904948 | 0.369265 | 29.918366 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_bt_8b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table | 0fe230b3432fb2b0f89942d7926291a4dbeb2820 | 21.781466 | 0 | 8.03 | false | false | false | true | 0.665521 | 0.602379 | 60.237946 | 0.496953 | 27.892403 | 0.086103 | 8.610272 | 0.259228 | 1.230425 | 0.367365 | 3.18724 | 0.365775 | 29.530511 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_2b-table |
xukp20_Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table | d1e19da1029f2d4d45de015754bc52dcb1ea5570 | 23.059714 | 0 | 8.03 | false | false | false | true | 0.588419 | 0.66203 | 66.203008 | 0.499994 | 28.439824 | 0.083082 | 8.308157 | 0.259228 | 1.230425 | 0.381812 | 5.126562 | 0.361453 | 29.05031 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-Iter3_gp_8b-table |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 | a478aa202c59773eba615ae37feb4cc750757695 | 20.364052 | 0 | 8.03 | false | false | false | true | 0.586443 | 0.533636 | 53.363631 | 0.491487 | 27.145374 | 0.06571 | 6.570997 | 0.259228 | 1.230425 | 0.377969 | 4.71276 | 0.36245 | 29.161126 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_2b-table-0.001 |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 | 8ef9ef7e2bf522e707a7b090af55f2ec1eafd4b9 | 23.261322 | 0 | 8.03 | false | false | false | true | 0.869474 | 0.685161 | 68.516093 | 0.507516 | 29.74055 | 0.054381 | 5.438066 | 0.258389 | 1.118568 | 0.383177 | 5.630469 | 0.362118 | 29.124187 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_bt_8b-table-0.002 |
xukp20_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 | 86673872245ad902f8d466bdc20edae9c115b965 | 20.032169 | 0 | 8.03 | false | false | false | true | 0.675094 | 0.548224 | 54.822427 | 0.488717 | 26.839803 | 0.044562 | 4.456193 | 0.260906 | 1.454139 | 0.363271 | 2.942187 | 0.367104 | 29.678265 | false | false | 2024-09-28 | 2024-09-29 | 0 | xukp20/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_2b-table-0.001 |
xukp20_llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xukp20__llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table | abb3afe2b0398b24ed823b0124c8a72d354487bd | 23.498955 | 0 | 8.03 | false | false | false | true | 1.379342 | 0.690931 | 69.093117 | 0.497846 | 28.119887 | 0.0929 | 9.29003 | 0.259228 | 1.230425 | 0.367333 | 3.083333 | 0.371592 | 30.176936 | false | false | 2024-09-22 | 2024-09-23 | 0 | xukp20/llama-3-8b-instruct-sppo-iter1-gp-2b-tau01-table |
xxx777xxxASD_L3.1-ClaudeMaid-4x8B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/xxx777xxxASD/L3.1-ClaudeMaid-4x8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">xxx777xxxASD/L3.1-ClaudeMaid-4x8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/xxx777xxxASD__L3.1-ClaudeMaid-4x8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | xxx777xxxASD/L3.1-ClaudeMaid-4x8B | 2a98d9cb91c7aa775acbf5bfe7bb91beb2faf682 | 26.190883 | llama3.1 | 7 | 24.942 | true | true | false | true | 2.376185 | 0.669649 | 66.964875 | 0.507085 | 29.437348 | 0.128399 | 12.839879 | 0.291107 | 5.480984 | 0.428937 | 13.750521 | 0.358045 | 28.67169 | false | false | 2024-07-27 | 2024-07-28 | 0 | xxx777xxxASD/L3.1-ClaudeMaid-4x8B |
yam-peleg_Hebrew-Gemma-11B-Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Gemma-11B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Gemma-11B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Gemma-11B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Gemma-11B-Instruct | a40259d1efbcac4829ed44d3b589716f615ed362 | 13.919763 | other | 22 | 10.475 | true | false | false | true | 1.937267 | 0.302077 | 30.207738 | 0.403578 | 16.862741 | 0.057402 | 5.740181 | 0.276007 | 3.467562 | 0.408854 | 9.973438 | 0.255402 | 17.266918 | false | false | 2024-03-06 | 2024-07-31 | 0 | yam-peleg/Hebrew-Gemma-11B-Instruct |
yam-peleg_Hebrew-Mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B | 3d32134b5959492fd7efbbf16395352594bc89f7 | 13.302117 | apache-2.0 | 65 | 7.504 | true | false | false | false | 1.399281 | 0.232834 | 23.283443 | 0.433404 | 20.17694 | 0.049849 | 4.984894 | 0.279362 | 3.914989 | 0.397656 | 7.673698 | 0.278009 | 19.778738 | false | false | 2024-04-26 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B |
yam-peleg_Hebrew-Mistral-7B-200K_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B-200K | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 10.644291 | apache-2.0 | 15 | 7.504 | true | false | false | false | 0.735312 | 0.185573 | 18.557317 | 0.414927 | 17.493603 | 0.023414 | 2.34139 | 0.276007 | 3.467562 | 0.376479 | 4.526563 | 0.257314 | 17.479314 | false | false | 2024-05-05 | 2024-07-11 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
yam-peleg_Hebrew-Mistral-7B-200K_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/yam-peleg/Hebrew-Mistral-7B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yam-peleg/Hebrew-Mistral-7B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yam-peleg__Hebrew-Mistral-7B-200K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yam-peleg/Hebrew-Mistral-7B-200K | 7b51c7b31e3d9e29ea964c579a45233cfad255fe | 8.235612 | apache-2.0 | 15 | 7.504 | true | false | false | true | 1.684494 | 0.17698 | 17.698041 | 0.34105 | 7.671324 | 0.021903 | 2.190332 | 0.253356 | 0.447427 | 0.374 | 4.416667 | 0.252909 | 16.989879 | false | false | 2024-05-05 | 2024-08-06 | 0 | yam-peleg/Hebrew-Mistral-7B-200K |
ycros_BagelMIsteryTour-v2-8x7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ycros/BagelMIsteryTour-v2-8x7B | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.258614 | cc-by-nc-4.0 | 16 | 46.703 | true | false | false | true | 3.649132 | 0.599432 | 59.943173 | 0.515924 | 31.699287 | 0.07855 | 7.854985 | 0.30453 | 7.270694 | 0.420292 | 11.303125 | 0.347324 | 27.480423 | true | false | 2024-01-19 | 2024-06-28 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
ycros_BagelMIsteryTour-v2-8x7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/ycros/BagelMIsteryTour-v2-8x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ycros/BagelMIsteryTour-v2-8x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ycros__BagelMIsteryTour-v2-8x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ycros/BagelMIsteryTour-v2-8x7B | 98a8b319707be3dab1659594da69a37ed8f8c148 | 24.724802 | cc-by-nc-4.0 | 16 | 46.703 | true | false | false | true | 3.619337 | 0.62621 | 62.620957 | 0.514194 | 31.366123 | 0.087613 | 8.761329 | 0.307886 | 7.718121 | 0.41375 | 10.31875 | 0.348072 | 27.563534 | true | false | 2024-01-19 | 2024-08-04 | 1 | ycros/BagelMIsteryTour-v2-8x7B (Merge) |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table | 97b2d0e790a6fcdf39c34a2043f0818368c7dcb3 | 22.974571 | 0 | 8.03 | false | false | false | true | 0.618253 | 0.670898 | 67.089766 | 0.498661 | 28.170107 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.372698 | 3.853906 | 0.371592 | 30.176936 | false | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_2b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table | e8786291c206d5cd1b01d29466e3b397278f4e2b | 24.877776 | 0 | 8.03 | false | false | false | true | 0.640663 | 0.733271 | 73.327105 | 0.508036 | 29.308128 | 0.097432 | 9.743202 | 0.260067 | 1.342282 | 0.380604 | 5.008854 | 0.374834 | 30.537086 | false | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_bt_8b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table | 0d9cb29aa87b0c17ed011ffbc83803f3f6dd18e7 | 23.168114 | 0 | 8.03 | false | false | false | true | 0.679554 | 0.678466 | 67.846647 | 0.494121 | 27.469588 | 0.095166 | 9.516616 | 0.259228 | 1.230425 | 0.364667 | 2.75 | 0.371759 | 30.195405 | false | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_2b-table |
yfzp_Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table | 7a326a956e6169b287a04ef93cdc0342a0f3311a | 24.001677 | 0 | 8.03 | false | false | false | true | 0.648184 | 0.713188 | 71.318768 | 0.502536 | 28.604424 | 0.093656 | 9.365559 | 0.259228 | 1.230425 | 0.371333 | 3.683333 | 0.368268 | 29.80755 | false | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-Iter1_gp_8b-table |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 | e5c8baadbf6ce17b344596ad42bd3546f66e253e | 22.364867 | 0 | 8.03 | false | false | false | true | 0.582235 | 0.649565 | 64.956538 | 0.497946 | 28.099199 | 0.048338 | 4.833837 | 0.259228 | 1.230425 | 0.377969 | 4.846094 | 0.372008 | 30.223109 | false | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_2b-table-0.001 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 | 064e237b850151938caf171a4c8c7e34c93e580e | 24.319539 | 0 | 8.03 | false | false | false | true | 0.606022 | 0.719607 | 71.960731 | 0.504515 | 28.785911 | 0.07855 | 7.854985 | 0.260067 | 1.342282 | 0.383146 | 5.593229 | 0.373421 | 30.380098 | false | false | 2024-09-29 | 2024-09-30 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_bt_8b-table-0.002 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 | b685b90063258e05f8b4930fdbce2e565f13f620 | 22.384837 | 0 | 8.03 | false | false | false | true | 0.649092 | 0.65044 | 65.043972 | 0.495788 | 27.825253 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.366031 | 2.853906 | 0.370263 | 30.029181 | false | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_2b-table-0.001 |
yfzp_Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yfzp__Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 | 5ab3f2cfc96bdda3b5a629ab4a81adf7394ba90a | 23.522522 | 0 | 8.03 | false | false | false | true | 0.60769 | 0.701597 | 70.159732 | 0.499155 | 28.120615 | 0.073263 | 7.326284 | 0.259228 | 1.230425 | 0.377906 | 4.638281 | 0.366938 | 29.659796 | false | false | 2024-09-29 | 2024-09-29 | 0 | yfzp/Llama-3-8B-Instruct-SPPO-score-Iter1_gp_8b-table-0.002 |
yifAI_Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yifAI__Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yifAI/Llama-3-8B-Instruct-SPPO-score-Iter3_gp_8b-table-0.002 | 7a046b74179225d6055dd8aa601b5234f817b1e5 | 22.624782 | 0 | 8.03 | false | false | false | true | 0.672016 | 0.648966 | 64.896586 | 0.491452 | 27.281064 | 0.068731 | 6.873112 | 0.261745 | 1.565996 | 0.389875 | 7.134375 | 0.351978 | 27.997562 | false | false | 2024-09-30 | 0 | Removed |
ylalain_ECE-PRYMMAL-YL-1B-SLERP-V8_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ylalain__ECE-PRYMMAL-YL-1B-SLERP-V8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 | 2c00dbc74e55d42fbc8b08f474fb9568f820edb9 | 9.604139 | apache-2.0 | 0 | 1.357 | true | false | false | false | 0.548428 | 0.150527 | 15.052727 | 0.397557 | 15.175392 | 0 | 0 | 0.28943 | 5.257271 | 0.387458 | 6.765625 | 0.238364 | 15.373818 | false | false | 2024-11-13 | 2024-11-13 | 0 | ylalain/ECE-PRYMMAL-YL-1B-SLERP-V8 |
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18 | aed2a9061ffa21beaec0d617a9605e160136aab4 | 14.633781 | gemma | 0 | 2.614 | true | false | false | true | 6.200402 | 0.463095 | 46.309459 | 0.40529 | 16.301992 | 0.003776 | 0.377644 | 0.288591 | 5.145414 | 0.375427 | 4.728385 | 0.234458 | 14.93979 | false | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-ORPO-jpn-it-abliterated-18-merge_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-ORPO-jpn-it-abliterated-18-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-ORPO-jpn-it-abliterated-18-merge | b72be0a7879f0d82cb2024cfc1d02c370ce3efe8 | 15.737663 | gemma | 0 | 2.614 | true | false | false | true | 1.98799 | 0.521821 | 52.182099 | 0.414689 | 17.348337 | 0.008308 | 0.830816 | 0.283557 | 4.474273 | 0.351396 | 3.357813 | 0.246094 | 16.232639 | false | false | 2024-10-30 | 2024-11-16 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17 | e6f82b93dae0b8207aa3252ab4157182e2610787 | 15.002982 | gemma | 1 | 2.614 | true | false | false | true | 1.104509 | 0.508157 | 50.815724 | 0.407627 | 16.234749 | 0 | 0 | 0.271812 | 2.908277 | 0.370062 | 3.891146 | 0.245512 | 16.167996 | false | false | 2024-10-16 | 2024-10-18 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-18-24_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-18-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-18-24 | 38f56fcb99bd64278a1d90dd23aea527036329a0 | 14.019765 | gemma | 0 | 2.614 | true | false | false | true | 0.704859 | 0.505484 | 50.548434 | 0.381236 | 13.114728 | 0 | 0 | 0.28104 | 4.138702 | 0.350156 | 2.069531 | 0.228225 | 14.247193 | false | false | 2024-11-06 | 2024-11-06 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO | 531b2e2043285cb40cd0433f5ad43441f8ac6b6c | 14.516851 | gemma | 1 | 2.614 | true | false | false | true | 9.681597 | 0.474785 | 47.478468 | 0.389798 | 14.389413 | 0.042296 | 4.229607 | 0.274329 | 3.243848 | 0.37676 | 4.528385 | 0.219082 | 13.231383 | false | false | 2024-10-18 | 2024-10-27 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-17-ORPO-alpaca | 5503b5e892be463fa4b1d265b8ba9ba4304af012 | 12.001731 | 0 | 2.614 | false | false | false | true | 1.184666 | 0.306473 | 30.647349 | 0.40716 | 16.922412 | 0.000755 | 0.075529 | 0.269295 | 2.572707 | 0.396917 | 7.914583 | 0.2249 | 13.877807 | false | false | 2024-10-27 | 0 | Removed |
ymcki_gemma-2-2b-jpn-it-abliterated-18_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-18 | c50b85f9b60b444f85fe230b8d77fcbc7b18ef91 | 15.503245 | gemma | 1 | 2.614 | true | false | false | true | 1.052664 | 0.517525 | 51.752461 | 0.413219 | 17.143415 | 0 | 0 | 0.27349 | 3.131991 | 0.374156 | 4.269531 | 0.250499 | 16.722074 | false | false | 2024-10-15 | 2024-10-18 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-18-ORPO_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-18-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-18-ORPO | b9f41f53827b8a5a600546b41f63023bf84617a3 | 14.943472 | gemma | 0 | 2.614 | true | false | false | true | 1.610377 | 0.474235 | 47.423503 | 0.403894 | 16.538079 | 0.035498 | 3.549849 | 0.261745 | 1.565996 | 0.395333 | 7.416667 | 0.218501 | 13.166741 | false | false | 2024-10-22 | 2024-10-22 | 3 | google/gemma-2-2b |
ymcki_gemma-2-2b-jpn-it-abliterated-24_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/ymcki/gemma-2-2b-jpn-it-abliterated-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ymcki/gemma-2-2b-jpn-it-abliterated-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ymcki__gemma-2-2b-jpn-it-abliterated-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ymcki/gemma-2-2b-jpn-it-abliterated-24 | 06c129ba5261ee88e32035c88f90ca11d835175d | 15.604076 | gemma | 0 | 2.614 | true | false | false | true | 0.810442 | 0.497866 | 49.786566 | 0.41096 | 16.77259 | 0 | 0 | 0.277685 | 3.691275 | 0.39149 | 7.002865 | 0.24734 | 16.371158 | false | false | 2024-10-24 | 2024-10-25 | 3 | google/gemma-2-2b |
yuvraj17_Llama3-8B-SuperNova-Spectrum-Hermes-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-Hermes-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO | 0da9f780f7dd94ed1e10c8d3e082472ff2922177 | 18.075579 | apache-2.0 | 0 | 8.03 | true | false | false | true | 0.97203 | 0.46909 | 46.908979 | 0.439987 | 21.238563 | 0.055891 | 5.589124 | 0.302013 | 6.935123 | 0.401219 | 9.61901 | 0.263464 | 18.162677 | false | false | 2024-09-24 | 2024-09-30 | 0 | yuvraj17/Llama3-8B-SuperNova-Spectrum-Hermes-DPO |
yuvraj17_Llama3-8B-SuperNova-Spectrum-dare_ties_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-SuperNova-Spectrum-dare_ties-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties | 998d15b32900bc230727c8a7984e005f611723e9 | 19.134801 | apache-2.0 | 0 | 8.03 | true | false | false | false | 0.914144 | 0.401271 | 40.127085 | 0.461579 | 23.492188 | 0.082326 | 8.232628 | 0.275168 | 3.355705 | 0.421094 | 11.003385 | 0.35738 | 28.597813 | true | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-SuperNova-Spectrum-dare_ties (Merge) |
yuvraj17_Llama3-8B-abliterated-Spectrum-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/yuvraj17/Llama3-8B-abliterated-Spectrum-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">yuvraj17/Llama3-8B-abliterated-Spectrum-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/yuvraj17__Llama3-8B-abliterated-Spectrum-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp | 28789950975ecf5aac846c3f2c0a5d6841651ee6 | 17.687552 | apache-2.0 | 0 | 8.03 | true | false | false | false | 0.82666 | 0.288488 | 28.848788 | 0.497791 | 28.54693 | 0.058157 | 5.81571 | 0.301174 | 6.823266 | 0.399823 | 11.011198 | 0.325715 | 25.079418 | true | false | 2024-09-22 | 2024-09-23 | 1 | yuvraj17/Llama3-8B-abliterated-Spectrum-slerp (Merge) |
zake7749_gemma-2-2b-it-chinese-kyara-dpo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zake7749/gemma-2-2b-it-chinese-kyara-dpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zake7749/gemma-2-2b-it-chinese-kyara-dpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zake7749__gemma-2-2b-it-chinese-kyara-dpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zake7749/gemma-2-2b-it-chinese-kyara-dpo | bbc011dae0416c1664a0287f3a7a0f9563deac91 | 19.334585 | gemma | 8 | 2.614 | true | false | false | false | 1.279309 | 0.538208 | 53.820751 | 0.425746 | 19.061804 | 0.066465 | 6.646526 | 0.266779 | 2.237136 | 0.457563 | 16.761979 | 0.257314 | 17.479314 | false | false | 2024-08-18 | 2024-10-17 | 1 | zake7749/gemma-2-2b-it-chinese-kyara-dpo (Merge) |
zelk12_Gemma-2-TM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/Gemma-2-TM-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/Gemma-2-TM-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__Gemma-2-TM-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/Gemma-2-TM-9B | 42366d605e6bdad354a5632547e37d34d300ff7a | 30.151929 | 0 | 10.159 | false | false | false | true | 1.967893 | 0.804462 | 80.446216 | 0.598659 | 42.049491 | 0 | 0 | 0.346477 | 12.863535 | 0.41524 | 11.238281 | 0.408826 | 34.314051 | false | false | 2024-11-06 | 2024-11-06 | 1 | zelk12/Gemma-2-TM-9B (Merge) |
zelk12_MT-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen1-gemma-2-9B | b78f8883614cbbdf182ebb4acf8a8c124bc782ae | 33.041356 | 0 | 10.159 | false | false | false | true | 3.362746 | 0.788625 | 78.862529 | 0.61 | 44.011247 | 0.133686 | 13.36858 | 0.346477 | 12.863535 | 0.421688 | 11.577604 | 0.438082 | 37.564642 | false | false | 2024-10-23 | 2024-10-23 | 1 | zelk12/MT-Gen1-gemma-2-9B (Merge) |
zelk12_MT-Gen2-GI-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-GI-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-GI-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-GI-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen2-GI-gemma-2-9B | e970fbcbf974f4626dcc6db7d2b02d4f24c72744 | 33.315847 | 1 | 10.159 | false | false | false | true | 1.868506 | 0.791398 | 79.139794 | 0.609556 | 44.002591 | 0.133686 | 13.36858 | 0.350671 | 13.422819 | 0.428323 | 12.673698 | 0.435588 | 37.287603 | false | false | 2024-11-10 | 2024-11-28 | 1 | zelk12/MT-Gen2-GI-gemma-2-9B (Merge) |
zelk12_MT-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen2-gemma-2-9B | c723f8b9b7334fddd1eb8b6e5230b76fb18139a5 | 33.644495 | 1 | 10.159 | false | false | false | true | 1.989448 | 0.790749 | 79.074855 | 0.610049 | 44.107782 | 0.148792 | 14.879154 | 0.346477 | 12.863535 | 0.432292 | 13.303125 | 0.438747 | 37.63852 | false | false | 2024-11-10 | 2024-11-10 | 1 | zelk12/MT-Gen2-gemma-2-9B (Merge) |
zelk12_MT-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen3-gemma-2-9B | 84627594655776ce67f1e01233113b658333fa71 | 32.936869 | 2 | 10.159 | false | false | false | true | 1.813248 | 0.802014 | 80.201421 | 0.609711 | 43.950648 | 0.114048 | 11.404834 | 0.348993 | 13.199105 | 0.421688 | 11.577604 | 0.435588 | 37.287603 | false | false | 2024-11-28 | 2024-11-30 | 1 | zelk12/MT-Gen3-gemma-2-9B (Merge) |
zelk12_MT-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen4-gemma-2-9B | d44beca936d18a5b4b65799487504c1097ae1cb2 | 33.722578 | gemma | 1 | 10.159 | true | false | false | true | 1.781683 | 0.788301 | 78.83006 | 0.610988 | 43.960404 | 0.165408 | 16.540785 | 0.354866 | 13.982103 | 0.422802 | 11.383594 | 0.438747 | 37.63852 | true | false | 2024-12-13 | 2024-12-13 | 1 | zelk12/MT-Gen4-gemma-2-9B (Merge) |
zelk12_MT-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Gen5-gemma-2-9B | aef27049b2a3c52138016e9602280150f70eae32 | 33.78338 | gemma | 1 | 10.159 | true | false | false | true | 1.787428 | 0.792322 | 79.232215 | 0.613279 | 44.398244 | 0.168429 | 16.8429 | 0.35151 | 13.534676 | 0.420167 | 10.8875 | 0.440243 | 37.804743 | true | false | 2024-12-22 | 2024-12-22 | 1 | zelk12/MT-Gen5-gemma-2-9B (Merge) |
zelk12_MT-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Max-Merge_02012025163610-gemma-2-9B | 2f279c5c648c22e77327d0c0098f90b69312afd3 | 34.049126 | gemma | 1 | 10.159 | true | false | false | true | 1.846119 | 0.790749 | 79.074855 | 0.614224 | 44.501684 | 0.182024 | 18.202417 | 0.35151 | 13.534676 | 0.422802 | 11.25026 | 0.439578 | 37.730866 | true | false | 2025-01-02 | 2025-01-02 | 1 | zelk12/MT-Max-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_MT-Merge-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge-gemma-2-9B | f4c3b001bc8692bcbbd7005b6f8db048e651aa46 | 33.393208 | 3 | 10.159 | false | false | false | true | 3.219056 | 0.803538 | 80.353795 | 0.611838 | 44.320842 | 0.13142 | 13.141994 | 0.348154 | 13.087248 | 0.425625 | 12.103125 | 0.43617 | 37.352246 | false | false | 2024-10-22 | 2024-10-22 | 1 | zelk12/MT-Merge-gemma-2-9B (Merge) |
zelk12_MT-Merge1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge1-gemma-2-9B | 71bb4577c877715f3f6646a224b184544639c856 | 33.130536 | 1 | 10.159 | false | false | false | true | 4.036662 | 0.788625 | 78.862529 | 0.61 | 44.058246 | 0.126888 | 12.688822 | 0.35151 | 13.534676 | 0.424385 | 12.148177 | 0.437417 | 37.490765 | false | false | 2024-11-07 | 2024-11-07 | 1 | zelk12/MT-Merge1-gemma-2-9B (Merge) |
zelk12_MT-Merge2-MU-gemma-2-MTg2MT1g2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-MU-gemma-2-MTg2MT1g2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B | 6d73ec2204800f7978c376567d3c6361c0a072cd | 33.557528 | 2 | 10.159 | false | false | false | true | 1.844885 | 0.795595 | 79.559458 | 0.608389 | 43.8402 | 0.138218 | 13.821752 | 0.350671 | 13.422819 | 0.432229 | 13.228646 | 0.437251 | 37.472296 | false | false | 2024-11-25 | 2024-11-28 | 1 | zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B (Merge) |
zelk12_MT-Merge2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge2-gemma-2-9B | a695e722e6fab77852f9fe59bbc4d69fe23c4208 | 33.498975 | 2 | 10.159 | false | false | false | true | 1.850791 | 0.787701 | 78.770108 | 0.610668 | 44.157197 | 0.155589 | 15.558912 | 0.350671 | 13.422819 | 0.421688 | 11.510938 | 0.438165 | 37.573877 | false | false | 2024-11-25 | 2024-11-25 | 1 | zelk12/MT-Merge2-gemma-2-9B (Merge) |
zelk12_MT-Merge3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge3-gemma-2-9B | 3f02f5e76d3aade3340307eb34b15bc9dd5a2023 | 33.192106 | gemma | 0 | 10.159 | true | false | false | true | 1.881021 | 0.785853 | 78.585265 | 0.610211 | 44.066073 | 0.133686 | 13.36858 | 0.348993 | 13.199105 | 0.42575 | 12.452083 | 0.437334 | 37.481531 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/MT-Merge3-gemma-2-9B (Merge) |
zelk12_MT-Merge4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge4-gemma-2-9B | 5f076ad8a3f3c403840a1cd572a6018bea34e889 | 34.12096 | gemma | 1 | 10.159 | true | false | false | true | 1.873385 | 0.780732 | 78.073179 | 0.611822 | 44.053492 | 0.188066 | 18.806647 | 0.352349 | 13.646532 | 0.429437 | 12.479688 | 0.438996 | 37.666223 | true | false | 2024-12-21 | 2024-12-21 | 1 | zelk12/MT-Merge4-gemma-2-9B (Merge) |
zelk12_MT-Merge5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-Merge5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-Merge5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-Merge5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-Merge5-gemma-2-9B | d8adfc6c5395baaeb3f5e0b50c585ed3f662c4d9 | 33.647426 | gemma | 2 | 10.159 | true | false | false | true | 1.80079 | 0.784379 | 78.437878 | 0.612267 | 44.240598 | 0.155589 | 15.558912 | 0.353188 | 13.758389 | 0.428135 | 12.25026 | 0.438747 | 37.63852 | true | false | 2024-12-30 | 2024-12-30 | 1 | zelk12/MT-Merge5-gemma-2-9B (Merge) |
zelk12_MT-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT-gemma-2-9B | 24e1f894517b86dd866c1a5999ced4a5924dcd90 | 30.239612 | 2 | 10.159 | false | false | false | true | 3.023399 | 0.796843 | 79.684349 | 0.60636 | 43.324243 | 0.003021 | 0.302115 | 0.345638 | 12.751678 | 0.407115 | 9.55599 | 0.422374 | 35.819297 | false | false | 2024-10-11 | 2024-10-11 | 1 | zelk12/MT-gemma-2-9B (Merge) |
zelk12_MT1-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen1-gemma-2-9B | 939ac6c12059a18fc1117cdb3861f46816eff2fb | 33.232259 | 0 | 10.159 | false | false | false | true | 3.362485 | 0.797443 | 79.744301 | 0.611779 | 44.273282 | 0.122356 | 12.23565 | 0.34396 | 12.527964 | 0.430958 | 13.103125 | 0.437583 | 37.509235 | false | false | 2024-10-23 | 2024-10-24 | 1 | zelk12/MT1-Gen1-gemma-2-9B (Merge) |
zelk12_MT1-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen2-gemma-2-9B | aeaca7dc7d50a425a5d3c38d7c4a7daf1c772ad4 | 33.142398 | 2 | 10.159 | false | false | false | true | 1.995995 | 0.798367 | 79.836722 | 0.609599 | 43.919191 | 0.113293 | 11.329305 | 0.352349 | 13.646532 | 0.428354 | 12.844271 | 0.435505 | 37.278369 | false | false | 2024-11-11 | 2024-11-11 | 1 | zelk12/MT1-Gen2-gemma-2-9B (Merge) |
zelk12_MT1-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen3-gemma-2-9B | 5cc4ee1c70f08a5b1a195d43f044d9bf6fca29f5 | 32.964927 | 0 | 10.159 | false | false | false | true | 1.944877 | 0.795969 | 79.596914 | 0.610155 | 43.990306 | 0.117825 | 11.782477 | 0.348993 | 13.199105 | 0.424323 | 12.007031 | 0.434924 | 37.213726 | false | false | 2024-12-01 | 2024-12-01 | 1 | zelk12/MT1-Gen3-gemma-2-9B (Merge) |
zelk12_MT1-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen4-gemma-2-9B | 5eaf1ef67f32805c6fbc0b51418a8caf866661a2 | 31.507236 | gemma | 1 | 10.159 | true | false | false | true | 1.742062 | 0.794121 | 79.412071 | 0.605757 | 43.145368 | 0.049094 | 4.909366 | 0.347315 | 12.975391 | 0.423115 | 12.089323 | 0.428607 | 36.511894 | true | false | 2024-12-14 | 2024-12-14 | 1 | zelk12/MT1-Gen4-gemma-2-9B (Merge) |
zelk12_MT1-Gen5-IF-gemma-2-S2DMv1-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen5-IF-gemma-2-S2DMv1-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B | 53a780fd3a2d42709a0f517cac019234d7d71267 | 30.810248 | 1 | 10.159 | false | false | false | true | 1.707127 | 0.792922 | 79.292167 | 0.6 | 42.201028 | 0.024924 | 2.492447 | 0.34396 | 12.527964 | 0.424479 | 12.593229 | 0.421792 | 35.754654 | false | false | 2024-12-24 | 2024-12-31 | 1 | zelk12/MT1-Gen5-IF-gemma-2-S2DMv1-9B (Merge) |
zelk12_MT1-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Gen5-gemma-2-9B | 4eb54f9a0a9f482537b0e79000ffe7fb9d024c38 | 30.636174 | gemma | 1 | 10.159 | true | false | false | true | 1.75523 | 0.779483 | 77.948288 | 0.601746 | 42.496764 | 0.032477 | 3.247734 | 0.346477 | 12.863535 | 0.419146 | 11.459896 | 0.422207 | 35.800827 | true | false | 2024-12-24 | 2024-12-24 | 1 | zelk12/MT1-Gen5-gemma-2-9B (Merge) |
zelk12_MT1-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B | e9177c45a9dc1ff2ace378d4809ea92ff6e477c4 | 33.853363 | gemma | 1 | 10.159 | true | false | false | true | 1.87798 | 0.792872 | 79.28718 | 0.612267 | 44.226377 | 0.161631 | 16.163142 | 0.354866 | 13.982103 | 0.4255 | 11.8875 | 0.438165 | 37.573877 | true | false | 2025-01-04 | 2025-01-04 | 1 | zelk12/MT1-Max-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_MT1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT1-gemma-2-9B | 3a5e77518ca9c3c8ea2edac4c03bc220ee91f3ed | 33.633829 | 2 | 10.159 | false | false | false | true | 3.345719 | 0.79467 | 79.467036 | 0.610875 | 44.161526 | 0.149547 | 14.954683 | 0.345638 | 12.751678 | 0.432229 | 13.161979 | 0.435755 | 37.306073 | false | false | 2024-10-12 | 2024-10-14 | 1 | zelk12/MT1-gemma-2-9B (Merge) |
zelk12_MT2-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen1-gemma-2-9B | 167abf8eb4ea01fecd42dc32ad68160c51a8685a | 32.460223 | 0 | 10.159 | false | false | false | true | 3.38321 | 0.785578 | 78.557782 | 0.61008 | 44.141103 | 0.101208 | 10.120846 | 0.343121 | 12.416107 | 0.424323 | 12.007031 | 0.437666 | 37.518469 | false | false | 2024-10-24 | 2024-10-27 | 1 | zelk12/MT2-Gen1-gemma-2-9B (Merge) |
zelk12_MT2-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen2-gemma-2-9B | 24c487499b5833424ffb9932eed838bb254f61b4 | 33.471172 | 3 | 10.159 | false | false | false | true | 2.037441 | 0.7889 | 78.890012 | 0.609292 | 44.044503 | 0.148036 | 14.803625 | 0.346477 | 12.863535 | 0.427021 | 12.577604 | 0.43883 | 37.647754 | false | false | 2024-11-12 | 2024-11-12 | 1 | zelk12/MT2-Gen2-gemma-2-9B (Merge) |
zelk12_MT2-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen3-gemma-2-9B | bb750c2b76328c6dbc9adf9ae3d09551f3723758 | 32.967895 | 1 | 10.159 | false | false | false | true | 1.924377 | 0.781007 | 78.100662 | 0.610477 | 44.007274 | 0.132931 | 13.293051 | 0.346477 | 12.863535 | 0.423083 | 12.052083 | 0.437417 | 37.490765 | false | false | 2024-12-04 | 2024-12-04 | 1 | zelk12/MT2-Gen3-gemma-2-9B (Merge) |
zelk12_MT2-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen4-gemma-2-9B | 7a07de3719c3b8b8e90e79a65798bcc4ef454fc6 | 31.860932 | gemma | 1 | 10.159 | true | false | false | true | 1.786436 | 0.789599 | 78.959937 | 0.609655 | 43.778362 | 0.083082 | 8.308157 | 0.345638 | 12.751678 | 0.412542 | 10.467708 | 0.432098 | 36.899749 | true | false | 2024-12-15 | 2024-12-15 | 1 | zelk12/MT2-Gen4-gemma-2-9B (Merge) |
zelk12_MT2-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Gen5-gemma-2-9B | 94711cc263eab1464fa6b01c28ee5171b4467d84 | 31.594551 | gemma | 0 | 10.159 | true | false | false | true | 1.762136 | 0.774912 | 77.491168 | 0.606393 | 43.124281 | 0.063444 | 6.344411 | 0.35151 | 13.534676 | 0.424417 | 12.385417 | 0.430186 | 36.687352 | true | false | 2024-12-25 | 2024-12-25 | 1 | zelk12/MT2-Gen5-gemma-2-9B (Merge) |
zelk12_MT2-Max-Merge_02012025163610-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-Max-Merge_02012025163610-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B | 76d8a9cc371af30b5843fb69edc25ff767d6741f | 33.768996 | gemma | 0 | 10.159 | true | false | false | true | 1.828975 | 0.790149 | 79.014903 | 0.610846 | 44.040817 | 0.16994 | 16.993958 | 0.35151 | 13.534676 | 0.422833 | 11.354167 | 0.439079 | 37.675458 | true | false | 2025-01-07 | 2025-01-07 | 1 | zelk12/MT2-Max-Merge_02012025163610-gemma-2-9B (Merge) |
zelk12_MT2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT2-gemma-2-9B | d20d7169ce0f53d586504c50b4b7dc470bf8a781 | 33.2825 | 1 | 10.159 | false | false | false | true | 3.19411 | 0.788575 | 78.857542 | 0.611511 | 44.167481 | 0.147281 | 14.728097 | 0.347315 | 12.975391 | 0.421656 | 11.540365 | 0.436835 | 37.426123 | false | false | 2024-10-14 | 2024-10-15 | 1 | zelk12/MT2-gemma-2-9B (Merge) |
zelk12_MT3-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen1-gemma-2-9B | cd78df9e67e2e710d8d305f5a03a92c01b1b425d | 31.054845 | 1 | 10.159 | false | false | false | true | 3.113666 | 0.783779 | 78.377926 | 0.610676 | 44.119495 | 0.032477 | 3.247734 | 0.346477 | 12.863535 | 0.415115 | 10.75599 | 0.43268 | 36.964391 | false | false | 2024-10-24 | 2024-10-28 | 1 | zelk12/MT3-Gen1-gemma-2-9B (Merge) |
zelk12_MT3-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen2-gemma-2-9B | e4ef057d20751d89934025e9088ba98d89b921b5 | 30.963626 | 1 | 10.159 | false | false | false | true | 1.919108 | 0.784329 | 78.432891 | 0.609147 | 43.940226 | 0.020393 | 2.039275 | 0.357383 | 14.317673 | 0.411115 | 10.022656 | 0.433261 | 37.029034 | false | false | 2024-11-20 | 2024-11-20 | 1 | zelk12/MT3-Gen2-gemma-2-9B (Merge) |
zelk12_MT3-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen3-gemma-2-9B | 4ad54d6295f6364aa87f7aaa2a7bd112fb92ec00 | 32.359994 | 1 | 10.159 | false | false | false | true | 1.904463 | 0.785628 | 78.562769 | 0.608889 | 43.78374 | 0.090634 | 9.063444 | 0.35151 | 13.534676 | 0.42575 | 12.51875 | 0.430269 | 36.696587 | false | false | 2024-12-07 | 2024-12-07 | 1 | zelk12/MT3-Gen3-gemma-2-9B (Merge) |
zelk12_MT3-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen4-gemma-2-9B | 22066f5a275797ae870d2c58e8c75ac933ee1adf | 34.492356 | gemma | 2 | 10.159 | true | false | false | true | 1.820853 | 0.773713 | 77.371264 | 0.610084 | 43.779591 | 0.204683 | 20.468278 | 0.347315 | 12.975391 | 0.447635 | 14.721094 | 0.438747 | 37.63852 | true | false | 2024-12-16 | 2024-12-16 | 1 | zelk12/MT3-Gen4-gemma-2-9B (Merge) |
zelk12_MT3-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen5-gemma-2-9B | b02315782a4719734b159220ca1eef0770d022a5 | 32.189706 | gemma | 1 | 10.159 | true | false | false | true | 1.940627 | 0.799017 | 79.901661 | 0.609862 | 43.951199 | 0.072508 | 7.250755 | 0.353188 | 13.758389 | 0.419115 | 11.422656 | 0.431682 | 36.853576 | true | false | 2024-12-26 | 2024-12-26 | 1 | zelk12/MT3-Gen5-gemma-2-9B (Merge) |
zelk12_MT3-Gen5-gemma-2-9B_v1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-Gen5-gemma-2-9B_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-Gen5-gemma-2-9B_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-Gen5-gemma-2-9B_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-Gen5-gemma-2-9B_v1 | 40bfcc25ff421ff83d67a9c46474a0b40abf4f4a | 32.846339 | gemma | 2 | 10.159 | true | false | false | true | 1.928152 | 0.799616 | 79.961613 | 0.611333 | 44.159602 | 0.109517 | 10.951662 | 0.348993 | 13.199105 | 0.420385 | 11.48151 | 0.435921 | 37.324542 | true | false | 2024-12-27 | 2024-12-27 | 1 | zelk12/MT3-Gen5-gemma-2-9B_v1 (Merge) |
zelk12_MT3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT3-gemma-2-9B | d501b6ea59896fac3dc0a623501a5493b3573cde | 32.352524 | 1 | 10.159 | false | false | false | true | 3.136653 | 0.778609 | 77.860854 | 0.613078 | 44.248465 | 0.104985 | 10.498489 | 0.344799 | 12.639821 | 0.424292 | 11.903125 | 0.43268 | 36.964391 | false | false | 2024-10-15 | 2024-10-16 | 1 | zelk12/MT3-gemma-2-9B (Merge) |
zelk12_MT4-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen1-gemma-2-9B | 6ed2c66246c7f354decfd3579acb534dc4b0b48c | 33.544994 | 0 | 10.159 | false | false | false | true | 2.103561 | 0.7895 | 78.949964 | 0.609383 | 44.009524 | 0.150302 | 15.030211 | 0.34396 | 12.527964 | 0.432229 | 13.095313 | 0.438913 | 37.656989 | false | false | 2024-10-25 | 2024-10-29 | 1 | zelk12/MT4-Gen1-gemma-2-9B (Merge) |
zelk12_MT4-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen2-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen2-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen2-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen2-gemma-2-9B | 4d61a5799b11641a24e8b0f3eda0e987ff392089 | 33.794732 | 3 | 10.159 | false | false | false | true | 1.977047 | 0.805062 | 80.506168 | 0.610835 | 44.176658 | 0.1571 | 15.70997 | 0.345638 | 12.751678 | 0.425656 | 12.207031 | 0.436752 | 37.416888 | false | false | 2024-11-22 | 2024-11-22 | 1 | zelk12/MT4-Gen2-gemma-2-9B (Merge) |
zelk12_MT4-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen3-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen3-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen3-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen3-gemma-2-9B | f93026d28ca1707e8c21620be8558eed6be43b1c | 33.239752 | 0 | 10.159 | false | false | false | true | 1.958701 | 0.784054 | 78.405409 | 0.608711 | 43.89439 | 0.151057 | 15.10574 | 0.34396 | 12.527964 | 0.424323 | 11.940365 | 0.438082 | 37.564642 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT4-Gen3-gemma-2-9B (Merge) |
zelk12_MT4-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen4-gemma-2-9B | 51f3deb0aba90d82fc3f21894b3171fa5afbffa5 | 32.090103 | gemma | 0 | 10.159 | true | false | false | true | 1.793741 | 0.787426 | 78.742625 | 0.607603 | 43.47581 | 0.077039 | 7.703927 | 0.352349 | 13.646532 | 0.424354 | 12.044271 | 0.432347 | 36.927453 | true | false | 2024-12-19 | 2024-12-19 | 1 | zelk12/MT4-Gen4-gemma-2-9B (Merge) |
zelk12_MT4-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-Gen5-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-Gen5-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-Gen5-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-Gen5-gemma-2-9B | 59681ccdc7e6f1991cc5663464806665bc3bf4c8 | 33.423935 | gemma | 2 | 10.159 | true | false | false | true | 1.847362 | 0.778883 | 77.888336 | 0.610666 | 43.947892 | 0.148792 | 14.879154 | 0.356544 | 14.205817 | 0.426833 | 12.020833 | 0.438414 | 37.601581 | true | false | 2024-12-28 | 2024-12-28 | 1 | zelk12/MT4-Gen5-gemma-2-9B (Merge) |
zelk12_MT4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT4-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT4-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT4-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT4-gemma-2-9B | 2167ea02baf9145a697a7d828a17c75b86e5e282 | 33.447349 | 0 | 10.159 | false | false | false | true | 3.155259 | 0.776161 | 77.616059 | 0.607314 | 43.553827 | 0.173716 | 17.371601 | 0.338087 | 11.744966 | 0.430927 | 12.999219 | 0.436586 | 37.398419 | false | false | 2024-10-16 | 2024-10-20 | 1 | zelk12/MT4-gemma-2-9B (Merge) |
zelk12_MT5-Gen1-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/zelk12/MT5-Gen1-gemma-2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">zelk12/MT5-Gen1-gemma-2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/zelk12__MT5-Gen1-gemma-2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | zelk12/MT5-Gen1-gemma-2-9B | 0291b776e80f38381788cd8f1fb2c3435ad891b5 | 31.897632 | 0 | 10.159 | false | false | false | true | 2.017253 | 0.78313 | 78.312987 | 0.611048 | 44.183335 | 0.068731 | 6.873112 | 0.347315 | 12.975391 | 0.420385 | 11.614844 | 0.436835 | 37.426123 | false | false | 2024-10-25 | 2024-10-31 | 1 | zelk12/MT5-Gen1-gemma-2-9B (Merge) |