---
license: apache-2.0
datasets:
- Skylion007/openwebtext
language:
- en
pipeline_tag: text-generation
---
This is a pre-trained language model based on the Mistral 7B architecture, shrunk to approximately 248 million parameters. It required minimal training: convergence was achieved with only 250,000 examples over 125,000 steps. The model is not intended for direct use, but rather for fine-tuning on a downstream task (see the sketch below).
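
As a rough illustration, the checkpoint can be loaded and fine-tuned with the Hugging Face `transformers` Trainer API. This is a minimal sketch under assumptions, not the authors' actual setup: `REPO_ID` stands in for this model's repository ID, and `train.txt` is a hypothetical downstream corpus.

```python
# Minimal fine-tuning sketch. REPO_ID and train.txt are placeholders.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

REPO_ID = "<this-model's-repo-id>"  # placeholder, not the real ID

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for batch padding

model = AutoModelForCausalLM.from_pretrained(REPO_ID)

# An illustrative downstream text corpus; swap in your own task data.
dataset = load_dataset("text", data_files={"train": "train.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned", num_train_epochs=1),
    train_dataset=dataset,
    # mlm=False selects the causal-LM objective (labels = shifted inputs).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```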

During evaluation on InstructMix, this model achieved an average perplexity score of 6.3.
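
Perplexity here is the exponential of the model's mean token-level cross-entropy loss. The following is a minimal sketch of how such a figure can be computed with `transformers`; `REPO_ID` and the sample text are placeholders, not the actual evaluation setup.

```python
# Sketch of a per-example perplexity computation (placeholders throughout).
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "<this-model's-repo-id>"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForCausalLM.from_pretrained(REPO_ID)
model.eval()

text = "Example passage from the evaluation set."  # illustrative only
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels set, the model returns the mean cross-entropy loss.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity = {math.exp(loss.item()):.2f}")
```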