---
license: apache-2.0
---
<style>
table {
    border-collapse: collapse;
    width: 100%;
    margin-bottom: 20px;
}
th, td {
    border: 1px solid #ddd;
    padding: 8px;
    text-align: center;
}
.best {
    font-weight: bold;
    text-decoration: underline;
}
</style>

<div style="text-align: center; margin: 20px auto; padding: 20px; border: 3px solid #ddd; border-radius: 10px;">
  <h2 style="margin-bottom: 4px; margin-top: 0px;">OuteAI</h2>
  <a href="https://www.outeai.com/" target="_blank" style="margin-right: 10px;">🌎 OuteAI.com</a> 
  <a href="https://discord.gg/vyBM87kAmf" target="_blank" style="margin-right: 10px;">🤝 Join our Discord</a>
  <a href="https://x.com/OuteAI" target="_blank">𝕏 @OuteAI</a>
</div>

# Lite-Oute-1-300M

Lite-Oute-1-300M (Base) is a Lite series model based on the Mistral architecture, comprising approximately 300 million parameters. <br>
It is designed as a starting point for fine-tuning on downstream tasks: its size offers a balance between compactness and capability, making it suitable for a wide range of fine-tuning applications.<br>
The model was trained on 30 billion tokens with a context length of 4096, providing a solid foundation for task-specific adaptation.
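## Usage

As a base (non-instruct) model, it is intended for plain text completion rather than chat. Below is a minimal usage sketch with Hugging Face `transformers`, assuming standard `AutoModelForCausalLM`/`AutoTokenizer` compatibility; adjust dtype, device, and sampling parameters to your setup.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal sketch: load the base model for plain text completion.
# Assumes standard AutoModelForCausalLM / AutoTokenizer compatibility.
model_name = "OuteAI/Lite-Oute-1-300M"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # adjust dtype/device to your hardware
)

# Base models continue a prompt; they do not follow chat instructions.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=32,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```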

## Available versions:
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-Instruct">Lite-Oute-1-300M-Instruct</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-Instruct-GGUF">Lite-Oute-1-300M-Instruct-GGUF</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M">Lite-Oute-1-300M</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-GGUF">Lite-Oute-1-300M-GGUF</a> <br>

## Benchmarks:
<table style="text-align: left;">
  <tr>
    <th>Benchmark</th>
    <th>5-shot</th>
    <th>0-shot</th>
  </tr>
  <tr>
    <td>ARC Challenge</td>
    <td>26.62</td>
    <td>26.28</td>
  </tr>
  <tr>
    <td>ARC Easy</td>
    <td>51.39</td>
    <td>48.11</td>
  </tr>
  <tr>
    <td>CommonsenseQA</td>
    <td>19.49</td>
    <td>20.64</td>
  </tr>
  <tr>
    <td>HellaSWAG</td>
    <td>34.86</td>
    <td>34.85</td>
  </tr>
  <tr>
    <td>MMLU</td>
    <td>27.23</td>
    <td>24.87</td>
  </tr>
  <tr>
    <td>OpenBookQA</td>
    <td>30.20</td>
    <td>30.80</td>
  </tr>
  <tr>
    <td>PIQA</td>
    <td>65.07</td>
    <td>65.02</td>
  </tr>
  <tr>
    <td>Winogrande</td>
    <td>51.14</td>
    <td>53.35</td>
  </tr>
</table>
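
The evaluation harness used for these scores is not stated in this card. For reference, the sketch below shows one plausible way to produce comparable numbers with EleutherAI's lm-evaluation-harness; the `simple_evaluate` API and task names follow harness v0.4 conventions and are an assumption, not the card's documented methodology.

```python
# Hedged sketch: evaluate the model with EleutherAI's
# lm-evaluation-harness (pip install lm-eval). The harness and
# version used for the table above are assumptions.
from lm_eval import simple_evaluate

results = simple_evaluate(
    model="hf",
    model_args="pretrained=OuteAI/Lite-Oute-1-300M",
    tasks=["arc_challenge", "arc_easy", "hellaswag", "piqa", "winogrande"],
    num_fewshot=5,  # use 0 to reproduce the 0-shot column
)
for task, metrics in results["results"].items():
    print(task, metrics)
```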

## Risk Disclaimer

By using this model, you acknowledge that you understand and assume the risks associated with its use. You are solely responsible for ensuring compliance with all applicable laws and regulations. We disclaim any liability for problems arising from the use of this open-source model, including but not limited to direct, indirect, incidental, consequential, or punitive damages. We make no warranties, express or implied, regarding the model's performance, accuracy, or fitness for a particular purpose. Your use of this model is at your own risk, and you agree to hold harmless and indemnify us, our affiliates, and our contributors from any claims, damages, or expenses arising from your use of the model.