nguyenkhoa committed · verified
Commit 44c10b5 · Parent(s): 02dcc28

Model save

README.md CHANGED
@@ -1,14 +1,8 @@
 ---
 library_name: transformers
-license: apache-2.0
-base_model: facebook/dinov2-small
+base_model: nguyenkhoa/dinov2_Liveness_detection_v2.1.4
 tags:
 - generated_from_trainer
-metrics:
-- accuracy
-- f1
-- recall
-- precision
 model-index:
 - name: dinov2_Liveness_detection_v2.2.1
   results: []
@@ -17,16 +11,10 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/nguyenkhoaht002/liveness_detection/runs/wo6b0psl)
+[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/nguyenkhoaht002/liveness_detection/runs/zdrli5b6)
 # dinov2_Liveness_detection_v2.2.1
 
-This model is a fine-tuned version of [facebook/dinov2-small](https://huggingface.co/facebook/dinov2-small) on an unknown dataset.
-It achieves the following results on the evaluation set:
-- Loss: 0.0671
-- Accuracy: 0.9869
-- F1: 0.9868
-- Recall: 0.9869
-- Precision: 0.9870
+This model is a fine-tuned version of [nguyenkhoa/dinov2_Liveness_detection_v2.1.4](https://huggingface.co/nguyenkhoa/dinov2_Liveness_detection_v2.1.4) on an unknown dataset.
 
 ## Model description
 
@@ -46,39 +34,17 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 512
+- train_batch_size: 768
 - eval_batch_size: 8
 - seed: 42
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - num_epochs: 5
 - mixed_precision_training: Native AMP
 
-### Training results
-
-| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     | Recall | Precision |
-|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
-| 0.183         | 0.8153 | 128  | 0.2473          | 0.9016   | 0.9039 | 0.9016 | 0.9123    |
-| 0.1022        | 1.6306 | 256  | 0.0750          | 0.9729   | 0.9727 | 0.9729 | 0.9737    |
-| 0.0432        | 2.4459 | 384  | 0.0575          | 0.9820   | 0.9820 | 0.9820 | 0.9823    |
-| 0.0247        | 3.2611 | 512  | 0.0507          | 0.9832   | 0.9832 | 0.9832 | 0.9833    |
-| 0.0115        | 4.0764 | 640  | 0.0536          | 0.9865   | 0.9864 | 0.9865 | 0.9866    |
-| 0.002         | 4.8917 | 768  | 0.0671          | 0.9869   | 0.9868 | 0.9869 | 0.9870    |
-
-### Evaluate results
-
-- APCER: 0.2490
-- BPCER: 0.0676
-- ACER: 0.1583
-
-- Accuracy: 0.81
-- F1: 0.84
-- Recall: 0.93
-- Precision: 0.62
-
 ### Framework versions
 
-- Transformers 4.44.2
-- Pytorch 2.4.1+cu121
+- Transformers 4.47.0
+- Pytorch 2.5.1+cu121
 - Datasets 3.2.0
-- Tokenizers 0.19.1
+- Tokenizers 0.21.0
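The updated hyperparameters in the diff above map directly onto the `transformers` `Trainer` configuration. As a minimal sketch (the argument names follow the standard `TrainingArguments` convention and are an assumption beyond what the card itself states):

```python
# Hyperparameters from the updated model card, collected as a plain dict.
# These keys match transformers.TrainingArguments parameter names (assumed mapping).
training_config = {
    "learning_rate": 5e-05,
    "per_device_train_batch_size": 768,   # train_batch_size in the card
    "per_device_eval_batch_size": 8,      # eval_batch_size in the card
    "seed": 42,
    "optim": "adamw_torch",               # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    "lr_scheduler_type": "linear",
    "num_train_epochs": 5,
    "fp16": True,                         # "Native AMP" mixed-precision training
}

# With transformers installed, this could be passed through as:
#   from transformers import TrainingArguments
#   args = TrainingArguments(output_dir="dinov2_Liveness_detection_v2.2.1", **training_config)
```

Note that the diff's only substantive training change from the previous revision is the larger train batch size (512 to 768) and the switch of base checkpoint to the earlier fine-tune.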
runs/Jan22_04-19-16_ab507d0761b0/events.out.tfevents.1737519560.ab507d0761b0.18.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8362d3a500922a5414ae0e0acee5bb14e9717164c0cd9ac73b6c1e2c28943d8d
-size 15806
+oid sha256:ee6d1916c7eea4d80038418cc3d6b8187c014d605a922f776e531650a7282b06
+size 16160
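The `events.out.tfevents` file above is stored via Git LFS, so the diff shows only the three-line pointer blob (`version`, `oid`, `size`), not the binary contents. A minimal sketch of parsing such a pointer (the `parse_lfs_pointer` helper is hypothetical, written here for illustration):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer blob (spec v1) into its key/value fields.

    Each line of a pointer file is "key value"; the value may itself
    contain no spaces, so a single partition per line is sufficient.
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The new pointer from the diff above:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:ee6d1916c7eea4d80038418cc3d6b8187c014d605a922f776e531650a7282b06
size 16160"""

info = parse_lfs_pointer(pointer)
# info["oid"] carries the hash algorithm prefix; info["size"] is the byte count as a string.
```

The `size` change (15806 to 16160 bytes) simply reflects the new TensorBoard event log written during the re-run.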