---
base_model: ProTrekHub/Protein_Encoder_35M
library_name: peft
---

# Model Card for Model-demo-35M

This model is a LoRA adapter used for a demo protein-level classification task.

## Task type

Protein-level Classification

## Model input type

SA (structure-aware) sequence

## Label meanings

0: None

## LoRA config

- **r:** 8
- **lora_dropout:** 0.0
- **lora_alpha:** 16
- **target_modules:** `['output.dense', 'key', 'value', 'query', 'intermediate.dense']`
- **modules_to_save:** `['classifier']`
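The adapter settings above map onto a 🤗 PEFT `LoraConfig` roughly as sketched below. This is an illustrative reconstruction, assuming the standard `peft` API; the card itself only lists the values, so any parameter not shown here (e.g. `task_type`) is left at its default rather than guessed.

```python
from peft import LoraConfig

# Sketch of the adapter configuration listed above.
# Values mirror the card exactly; nothing else is assumed.
lora_config = LoraConfig(
    r=8,                     # rank of the low-rank update matrices
    lora_alpha=16,           # scaling factor (effective scale = alpha / r = 2)
    lora_dropout=0.0,
    target_modules=["output.dense", "key", "value", "query", "intermediate.dense"],
    modules_to_save=["classifier"],  # the classifier head is trained fully, not via LoRA
)
```

Note that `modules_to_save` marks the classification head as fully trainable alongside the low-rank adapters, which is the usual pattern when fine-tuning an encoder for a new classification task.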

## Training config

- **optimizer:**
  - **class:** AdamW
  - **betas:** (0.9, 0.98)
  - **weight_decay:** 0.01
- **learning rate:** 0.001
- **epochs:** 10
- **batch size:** 2
- **precision:** 16-mixed
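In PyTorch terms, the optimizer settings above correspond to the sketch below. The `nn.Linear` is only a placeholder for the PEFT-wrapped 35M encoder, which this card does not reproduce; the hyperparameter values are taken directly from the list above.

```python
import torch
from torch import nn

model = nn.Linear(8, 2)  # placeholder for the PEFT-wrapped encoder

optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=1e-3,                 # learning rate: 0.001
    betas=(0.9, 0.98),
    weight_decay=0.01,
)
```

Training then runs for 10 epochs at batch size 2 under 16-bit mixed precision, per the config above.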