---
library_name: transformers
license: cc-by-nc-sa-4.0
base_model: InstaDeepAI/nucleotide-transformer-500m-human-ref
tags:
- generated_from_trainer
model-index:
- name: nucleotide-transformer-finetuned-lora-NucleotideTransformer
  results: []
---

# nucleotide-transformer-finetuned-lora-NucleotideTransformer

This model is a LoRA fine-tuned version of [InstaDeepAI/nucleotide-transformer-500m-human-ref](https://huggingface.co/InstaDeepAI/nucleotide-transformer-500m-human-ref) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2358
- F1 Score: 0.9363

## Model description

More information needed

## Intended uses & limitations

More information needed. A usage sketch is provided under "How to use" below.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them appears under "Training setup sketch" below):
- learning_rate: 0.0005
- train_batch_size: 8
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 1000

### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1 Score |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.6471        | 0.0158 | 100  | 0.3372          | 0.8961   |
| 0.3843        | 0.0316 | 200  | 0.2311          | 0.9029   |
| 0.3449        | 0.0474 | 300  | 0.2503          | 0.9145   |
| 0.3466        | 0.0632 | 400  | 0.2059          | 0.9211   |
| 0.2615        | 0.0790 | 500  | 0.2347          | 0.9281   |
| 0.295         | 0.0948 | 600  | 0.2221          | 0.9199   |
| 0.272         | 0.1107 | 700  | 0.2228          | 0.9303   |
| 0.2507        | 0.1265 | 800  | 0.2339          | 0.9226   |
| 0.2356        | 0.1423 | 900  | 0.2088          | 0.9388   |
| 0.1854        | 0.1581 | 1000 | 0.2358          | 0.9363   |

### Framework versions

- Transformers 4.47.1
- PyTorch 2.5.1+cu118
- Datasets 3.2.0
- Tokenizers 0.21.0
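
### Training setup sketch

The card does not record the LoRA configuration, the dataset, or the label count, so the following is only a minimal sketch of a `Trainer` setup matching the hyperparameters listed above. The LoRA rank, alpha, and target modules, `num_labels=2`, and the `build_trainer` helper are assumptions, not values taken from this card.

```python
# Minimal sketch of a LoRA fine-tuning setup matching the hyperparameters
# above. LoRA rank/alpha/target modules and num_labels are assumptions;
# the card does not record them.
import numpy as np
from peft import LoraConfig, get_peft_model
from sklearn.metrics import f1_score
from transformers import (
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

BASE_ID = "InstaDeepAI/nucleotide-transformer-500m-human-ref"


def compute_metrics(eval_pred):
    """Report F1, the metric shown in the results table above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"f1_score": f1_score(labels, preds)}


def build_trainer(train_dataset, eval_dataset):
    """Build a Trainer; pass tokenized, labeled `datasets.Dataset` objects."""
    model = AutoModelForSequenceClassification.from_pretrained(BASE_ID, num_labels=2)

    # Hypothetical LoRA settings; "query"/"value" are the attention
    # projection names in the ESM architecture this base model uses.
    lora_config = LoraConfig(
        r=16,
        lora_alpha=32,
        target_modules=["query", "value"],
        task_type="SEQ_CLS",
    )
    model = get_peft_model(model, lora_config)

    args = TrainingArguments(
        output_dir="nucleotide-transformer-finetuned-lora-NucleotideTransformer",
        learning_rate=5e-4,               # 0.0005, as listed above
        per_device_train_batch_size=8,
        per_device_eval_batch_size=64,
        seed=42,
        optim="adamw_torch",              # AdamW, betas=(0.9, 0.999), eps=1e-08
        lr_scheduler_type="linear",
        max_steps=1000,                   # training_steps: 1000
        eval_strategy="steps",
        eval_steps=100,                   # matches the 100-step cadence in the results table
        logging_steps=100,
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        eval_dataset=eval_dataset,
        compute_metrics=compute_metrics,
    )
```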
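
## How to use

A minimal inference sketch, assuming the LoRA adapter was trained for binary sequence classification and published as a PEFT adapter; the adapter repo id below is a hypothetical placeholder.

```python
# Minimal inference sketch. The adapter repo id is a hypothetical
# placeholder, and num_labels=2 is an assumption (the card reports a
# single F1 score, which suggests binary classification).
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

BASE_ID = "InstaDeepAI/nucleotide-transformer-500m-human-ref"
ADAPTER_ID = "your-username/nucleotide-transformer-finetuned-lora-NucleotideTransformer"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForSequenceClassification.from_pretrained(BASE_ID, num_labels=2)
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)
model.eval()

# The Nucleotide Transformer tokenizer accepts plain A/C/G/T strings
# (it splits them into 6-mer tokens internally).
sequence = "ATTCCGATTCCGATTCCGGCTA"
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```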