maddes8cht committed
Commit 66cf96e
1 Parent(s): ee5150e

"Update README.md"

Files changed (1)
  1. README.md +8 -0
README.md CHANGED
@@ -14,6 +14,14 @@ I'm constantly enhancing these model descriptions to provide you with the most r
  MPT-7b and MPT-30B are part of the family of Mosaic Pretrained Transformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.
 
 
+ ---
+ # Brief
+ This is a finetuning of mpt-30-b by iamplus using the entire flan1m-GPT4 dataset.
+ However, the model is from July and is not based on a current MPT model.
+
+ ---
+
+
 
  # About GGUF format
 