Update README.md
README.md CHANGED
@@ -10,7 +10,7 @@ tags:

The Mistral-Prot-v1-134M Large Language Model (LLM) is a pretrained generative protein molecule model with 133.8M parameters.
It is derived from the Mixtral-8x7B-v0.1 model, which was simplified for proteins: the number of layers and the hidden size were reduced.
- The model was pretrained using
+ The model was pretrained using 10M protein strings from the UniProt 50 database.

## Model Architecture
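
As a usage illustration for the card text above, here is a minimal sketch of loading the model with the Hugging Face `transformers` library and embedding a protein sequence. The repository id, the example sequence, and the mean-pooling step are assumptions for illustration only and are not part of the original README.

```python
# Minimal sketch: load Mistral-Prot-v1-134M with transformers and embed a protein sequence.
# The repo id below is an assumption; substitute the model's actual Hub id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RaphaelMourad/Mistral-Prot-v1-134M"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Tokenize an amino-acid string and mean-pool the last hidden states
# to obtain a fixed-size sequence embedding.
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"  # example sequence, not from the card
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)
embedding = outputs.hidden_states[-1].mean(dim=1)  # shape: (1, hidden_size)
print(embedding.shape)
```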