Upload folder using huggingface_hub
README.md
CHANGED

## Evaluation
```
hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 5, batch_size: auto
|Tasks|Version| Filter |n-shot| Metric |Value | |Stderr|
|-----|-------|----------|-----:|-----------|-----:|---|-----:|
|gsm8k|Yaml |get-answer| 5|exact_match|0.6467|± |0.0132|

hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 0, batch_size: auto (64)
| Tasks |Version|Filter|n-shot|Metric|Value | |Stderr|
|--------------|-------|------|-----:|------|-----:|---|-----:|
|truthfulqa_mc2|Yaml |none | 0|acc |0.7368|± |0.0149|

hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 25, batch_size: auto (32)
| Tasks |Version|Filter|n-shot| Metric |Value| |Stderr|
|-------------|-------|------|-----:|--------|----:|---|-----:|
|arc_challenge|Yaml |none | 25|acc |0.692|± |0.0135|
| | |none | 25|acc_norm|0.715|± |0.0132|

hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 0, batch_size: auto (64)
| Tasks |Version|Filter|n-shot|Metric| Value | |Stderr|
|-----------|-------|------|-----:|------|------:|---|-----:|
|paws_de |Yaml |none | 0|acc | 0.3965|± |0.0109|
|wmt16-en-de|Yaml |none | 0|bleu | 3.5784|± |0.1325|
| | |none | 0|ter |64.5707|± |0.4514|
| | |none | 0|chrf |45.7068|± |0.3861|
|xnli_de |Yaml |none | 0|acc | 0.4129|± |0.0099|

hf (pretrained=fblgit/LUNA-SOLARkrautLM-Instruct), gen_kwargs: (), limit: None, num_fewshot: 10, batch_size: auto (32)
| Tasks |Version|Filter|n-shot| Metric |Value | |Stderr|
|---------|-------|------|-----:|--------|-----:|---|-----:|
|hellaswag|Yaml |none | 10|acc |0.7131|± |0.0045|
| | |none | 10|acc_norm|0.8815|± |0.0032|
```
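The scores above are printed in the output format of EleutherAI's lm-evaluation-harness. As a minimal sketch of how one of these rows could be re-run, assuming a recent harness release (v0.4 or later) that exposes `lm_eval.simple_evaluate`, and noting that the German tasks (`paws_de`, `wmt16-en-de`, `xnli_de`) may require an additional task collection that is not bundled by default:

```python
# Minimal sketch: re-run the 5-shot gsm8k row with lm-evaluation-harness.
# Assumes `pip install lm_eval` (v0.4+) and enough GPU memory for the model;
# exact metric keys in the result dict may differ slightly between versions.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",  # Hugging Face transformers backend
    model_args="pretrained=fblgit/LUNA-SOLARkrautLM-Instruct",
    tasks=["gsm8k"],   # swap in truthfulqa_mc2, arc_challenge, hellaswag, ...
    num_fewshot=5,     # matches the 5-shot setting reported above
    batch_size="auto",
)
print(results["results"])  # per-task metrics such as exact_match / acc / acc_norm
```

Each header line in the block above records the `num_fewshot` and `batch_size` used for that run, so the other tables can be reproduced the same way by adjusting those two arguments.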

## Disclaimer
We must inform users that despite our best efforts in data cleansing, the possibility of uncensored content slipping through cannot be entirely ruled out.

## Collaborations
We are also keenly seeking support and investment for our startup, VAGO solutions, where we continuously advance the development of robust language models designed to address a diverse range of purposes and requirements. If the prospect of collaboratively navigating future challenges excites you, we warmly invite you to reach out to us.

Juanako.AI is likewise seeking support and investment for our startup, and we are also open to collaborating with other labs to build more awesome models like this one.

## Acknowledgement
A big hug to VAGO Solutions: we merely used our transformers library on their code and dataset, nothing else. This would not have been possible without them. Thanks!

Many thanks to [argilla](https://huggingface.co/datasets/argilla) and [Huggingface](https://huggingface.co) for providing such valuable datasets to the open-source community. And of course a big thanks to [upstage](https://huggingface.co/upstage) for providing the open-source community with their latest technology!