---
base_model: OFA-Sys/chinese-clip-vit-base-patch16
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: aoi_clip_high_resolution
  results: []
---

[Visualize in Weights & Biases](https://wandb.ai/shark_meow_team/huggingface/runs/p4n09c36)

# aoi_clip_high_resolution

This model is a fine-tuned version of [OFA-Sys/chinese-clip-vit-base-patch16](https://huggingface.co/OFA-Sys/chinese-clip-vit-base-patch16) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.7487
- Accuracy: 0.0334

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 40
- eval_batch_size: 40
- seed: 42
- gradient_accumulation_steps: 5
- total_train_batch_size: 200
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 2.4298        | 5.9923  | 1866  | 3.7282          | 0.0434   |
| 2.246         | 11.9846 | 3732  | 3.9666          | 0.0379   |
| 2.1293        | 17.9769 | 5598  | 4.1019          | 0.0360   |
| 2.0539        | 23.9692 | 7464  | 4.2821          | 0.0352   |
| 2.0135        | 29.9615 | 9330  | 4.3318          | 0.0345   |
| 1.9879        | 35.9538 | 11196 | 4.3700          | 0.0341   |
| 1.9648        | 41.9461 | 13062 | 4.4619          | 0.0341   |
| 1.9495        | 47.9383 | 14928 | 4.5999          | 0.0341   |
| 1.9391        | 53.9306 | 16794 | 4.6806          | 0.0339   |
| 1.9372        | 59.9229 | 18660 | 4.7487          | 0.0336   |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
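
### Reproducing the training configuration

The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below is illustrative, not the original training script: only the listed values come from this card, the output directory is a placeholder, and the choice of `fp16` for "Native AMP" is an assumption (bf16 would also qualify). Note that the effective batch size of 200 is consistent with the per-device batch of 40 times 5 gradient-accumulation steps on a single device.

```python
from transformers import TrainingArguments

# Values taken from the hyperparameter list above; everything else
# (output dir, dataset wiring, logging) is illustrative only.
training_args = TrainingArguments(
    output_dir="aoi_clip_high_resolution",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=40,   # train_batch_size: 40
    per_device_eval_batch_size=40,    # eval_batch_size: 40
    gradient_accumulation_steps=5,    # 40 * 5 = total train batch of 200
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=60.0,
    fp16=True,                        # "Native AMP"; assumption, could be bf16
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # epsilon=1e-08
)
```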
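
### Loading the model

The card does not document usage, but since the base model is Chinese-CLIP, the checkpoint should load with the standard `ChineseCLIPModel` and `ChineseCLIPProcessor` classes from Transformers. Below is a minimal inference sketch, assuming the fine-tune keeps the base architecture; the repository id and image path are placeholders:

```python
import torch
from PIL import Image
from transformers import ChineseCLIPModel, ChineseCLIPProcessor

# Placeholder repo id; substitute the actual path of this checkpoint.
model_id = "your-namespace/aoi_clip_high_resolution"

model = ChineseCLIPModel.from_pretrained(model_id)
processor = ChineseCLIPProcessor.from_pretrained(model_id)

image = Image.open("example.jpg")  # any RGB image
texts = ["一只猫", "一只狗"]        # candidate Chinese captions ("a cat", "a dog")

inputs = processor(text=texts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Image-to-text similarity logits, normalized into per-caption probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)
print(probs)
```

One caution when picking a checkpoint: the training results table shows validation loss rising from 3.7282 to 4.7487 while training loss falls, so evaluation accuracy peaks at the earliest logged checkpoint (0.0434 near epoch 6), suggesting the final checkpoint overfits the training data.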