Locutusque leaderboard-pr-bot committed
Commit
ced2fc9
1 Parent(s): 450c5b2

Adding Evaluation Results (#4)


- Adding Evaluation Results (f14cfcd237297d020b86e21322abb5562376a10c)


Co-authored-by: Open LLM Leaderboard PR Bot <[email protected]>

Files changed (1)
  1. README.md +156 -5
README.md CHANGED
@@ -10,10 +10,148 @@ tags:
  datasets:
  - Locutusque/hercules-v2.0
  widget:
- - example_title: "Derivative"
-   text: "<|im_start|>user\nIn this task, you are given a description of an object and its surroundings. Your task is to describe the motion of the object mentioned in the description using differential equations. The answer should be a mathematical expression in t. Description: A ball is thrown up at 30 ft/sec from 3 feet off the ground. Solve this step-by-step, explaining each step in detail.<|im_end|>\n<|im_start|>assistant\n"
-   output:
-     text: "Let h(t) be the height of the ball at time t after it was thrown. We know that when the ball is thrown, it has an initial velocity v0 = 30 ft/sec and initial height y0 = 3 ft. We also know that the acceleration due to gravity g = -32 ft/sec^2.\n\nTo solve for the motion of the ball, we can use the equation for vertical motion with acceleration due to gravity:\n\nh(t) = v0 * t + (1/2) * g * t^2 + y0\n\nSubstituting the values given in the problem, we get:\n\ny(t) = (30)(t) + (1/2)(-32)(t)^2 + 3\n\nNow, let's simplify the equation:\n\ny(t) = 30t - 16t^2 + 3\n\nThis is the final solution for the height of the ball as a function of time."
  ---
  # Model Card: Hercules-2.0-Mistral-7B

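As an aside, the widget example's projectile-motion algebra can be sanity-checked numerically. This is a minimal sketch (not part of the diff itself), assuming the values stated in the example: v0 = 30 ft/sec, g = -32 ft/sec^2, y0 = 3 ft.

```python
# Check that h(t) = v0*t + (1/2)*g*t^2 + y0 simplifies to 30t - 16t^2 + 3.
v0, g, y0 = 30.0, -32.0, 3.0

def h(t):
    """Height of the ball (ft) at time t (sec), general kinematics form."""
    return v0 * t + 0.5 * g * t ** 2 + y0

def h_simplified(t):
    """The simplified form given in the example answer."""
    return 30 * t - 16 * t ** 2 + 3

# The two forms agree at every sampled time.
for t in (0.0, 0.25, 0.9375, 1.875):
    assert abs(h(t) - h_simplified(t)) < 1e-9

print(h(0.0))    # 3.0 -- the initial height
print(h(1.875))  # 3.0 -- the ball is back at launch height at t = 2*v0/|g|
```

The flight is symmetric, so the ball returns to its launch height at t = 2 * 30 / 32 = 1.875 sec, which the last line confirms.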
@@ -94,4 +232,17 @@ This model was fine-tuned using the TPU-Alignment repository. https://github.com
 
  # Quants
 
- exl2 by @bartowski https://huggingface.co/bartowski/Hercules-2.0-Mistral-7B-exl2
  datasets:
  - Locutusque/hercules-v2.0
  widget:
+ - example_title: Derivative
+   text: '<|im_start|>user
+
+     In this task, you are given a description of an object and its surroundings. Your
+     task is to describe the motion of the object mentioned in the description using
+     differential equations. The answer should be a mathematical expression in t. Description:
+     A ball is thrown up at 30 ft/sec from 3 feet off the ground. Solve this step-by-step,
+     explaining each step in detail.<|im_end|>
+
+     <|im_start|>assistant
+
+     '
+   output:
+     text: 'Let h(t) be the height of the ball at time t after it was thrown. We know
+       that when the ball is thrown, it has an initial velocity v0 = 30 ft/sec and
+       initial height y0 = 3 ft. We also know that the acceleration due to gravity
+       g = -32 ft/sec^2.
+
+
+       To solve for the motion of the ball, we can use the equation for vertical motion
+       with acceleration due to gravity:
+
+
+       h(t) = v0 * t + (1/2) * g * t^2 + y0
+
+
+       Substituting the values given in the problem, we get:
+
+
+       y(t) = (30)(t) + (1/2)(-32)(t)^2 + 3
+
+
+       Now, let''s simplify the equation:
+
+
+       y(t) = 30t - 16t^2 + 3
+
+
+       This is the final solution for the height of the ball as a function of time.'
+ model-index:
+ - name: Hercules-2.0-Mistral-7B
+   results:
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: AI2 Reasoning Challenge (25-Shot)
+       type: ai2_arc
+       config: ARC-Challenge
+       split: test
+       args:
+         num_few_shot: 25
+     metrics:
+     - type: acc_norm
+       value: 61.09
+       name: normalized accuracy
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: HellaSwag (10-Shot)
+       type: hellaswag
+       split: validation
+       args:
+         num_few_shot: 10
+     metrics:
+     - type: acc_norm
+       value: 83.69
+       name: normalized accuracy
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: MMLU (5-Shot)
+       type: cais/mmlu
+       config: all
+       split: test
+       args:
+         num_few_shot: 5
+     metrics:
+     - type: acc
+       value: 63.47
+       name: accuracy
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: TruthfulQA (0-shot)
+       type: truthful_qa
+       config: multiple_choice
+       split: validation
+       args:
+         num_few_shot: 0
+     metrics:
+     - type: mc2
+       value: 43.97
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: Winogrande (5-shot)
+       type: winogrande
+       config: winogrande_xl
+       split: validation
+       args:
+         num_few_shot: 5
+     metrics:
+     - type: acc
+       value: 79.48
+       name: accuracy
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
+   - task:
+       type: text-generation
+       name: Text Generation
+     dataset:
+       name: GSM8k (5-shot)
+       type: gsm8k
+       config: main
+       split: test
+       args:
+         num_few_shot: 5
+     metrics:
+     - type: acc
+       value: 44.43
+       name: accuracy
+     source:
+       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=Locutusque/Hercules-2.0-Mistral-7B
+       name: Open LLM Leaderboard
  ---
  # Model Card: Hercules-2.0-Mistral-7B

 
 
  # Quants
 
+ exl2 by @bartowski https://huggingface.co/bartowski/Hercules-2.0-Mistral-7B-exl2
+ # [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hercules-2.0-Mistral-7B)
+
+ | Metric                          |Value|
+ |---------------------------------|----:|
+ |Avg.                             |62.69|
+ |AI2 Reasoning Challenge (25-Shot)|61.09|
+ |HellaSwag (10-Shot)              |83.69|
+ |MMLU (5-Shot)                    |63.47|
+ |TruthfulQA (0-shot)              |43.97|
+ |Winogrande (5-shot)              |79.48|
+ |GSM8k (5-shot)                   |44.43|
+
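For what it's worth, the Avg. row in the table above can be reproduced from the six per-benchmark scores. A quick check (not part of the diff):

```python
# Recompute the leaderboard average from the six benchmark scores in the table.
scores = {
    "AI2 Reasoning Challenge (25-Shot)": 61.09,
    "HellaSwag (10-Shot)": 83.69,
    "MMLU (5-Shot)": 63.47,
    "TruthfulQA (0-shot)": 43.97,
    "Winogrande (5-shot)": 79.48,
    "GSM8k (5-shot)": 44.43,
}
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 62.69, matching the Avg. row
```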