Update README.md
Impact on performance
We evaluated the models using a panel of large judge models (GPT-4o, Gemini 1.5 Pro, and Claude 3.5 Sonnet). Scores ranged from 0, indicating a model unsuitable for the task, to 5, representing a model that fully met expectations. The evaluation was based on 67 instructions across four programming languages: Python, Java, JavaScript, and pseudo-code. All tests were conducted in a French-language context, and models were heavily penalized for responding in another language, even when the response was technically correct.

| model                                            |       score |
|:-------------------------------------------------|------------:|
| gemini-1.5-pro                                   |     4.50995 |
| gpt-4o                                           |     4.50995 |
| claude3.5-sonnet                                 |     4.48756 |
| deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct      |     4.23691 |
| meta-llama/Meta-Llama-3.1-70B-Instruct           |     4.23383 |
| cmarkea/Meta-Llama-3.1-70B-Instruct-4bit         |     4.1393  |
| cmarkea/Mixtral-8x7B-Instruct-v0.1-4bit          |     3.801   |
| meta-llama/Meta-Llama-3.1-8B-Instruct            |     3.72637 |
| mistralai/Mixtral-8x7B-Instruct-v0.1             |     3.33416 |
| codellama/CodeLlama-13b-Instruct-hf              |     3.32836 |
| codellama/CodeLlama-34b-Instruct-hf              |     3.27363 |
| codellama/CodeLlama-7b-Instruct-hf               |     3.19403 |
| cmarkea/CodeLlama-34b-Instruct-hf-4bit           |     3.12189 |
| codellama/CodeLlama-70b-Instruct-hf              |     1.81592 |
| **cmarkea/CodeLlama-70b-Instruct-hf-4bit**       |  **1.6409** |
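
The panel scores above can be thought of as an average over judge ratings. The following is a minimal sketch of that aggregation, not the actual evaluation harness: the `ratings` values and the `panel_score` helper are hypothetical, standing in for the 0-5 ratings each judge model assigns to the 67 instructions.

```python
from statistics import mean

# Hypothetical per-judge ratings (0-5) for one evaluated model.
# In the real evaluation, each judge scores all 67 French-language
# instructions; only three ratings per judge are shown here.
ratings = {
    "gpt-4o": [5, 4, 5],
    "gemini-1.5-pro": [4, 4, 5],
    "claude-3.5-sonnet": [5, 3, 4],
}

def panel_score(ratings_by_judge):
    """Average each judge's ratings, then average across judges."""
    per_judge_means = [mean(scores) for scores in ratings_by_judge.values()]
    return mean(per_judge_means)

print(round(panel_score(ratings), 5))  # prints 4.33333
```

A model that answers in the wrong language would simply receive low ratings from every judge, which is how the language penalty flows into the final score.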