Model save

- README.md +3 -5
- training_args.bin +1 -1

README.md CHANGED

```diff
@@ -2,8 +2,6 @@
 license: apache-2.0
 base_model: SenseTime/deformable-detr
 tags:
-- object-detection
-- vision
 - generated_from_trainer
 model-index:
 - name: sensetime-deformable-detr-finetuned-10k-cppe5-manual-pad
@@ -13,10 +11,10 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/qubvel-hf-co/transformers-detection-model-finetuning-cppe5/runs/
+[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/qubvel-hf-co/transformers-detection-model-finetuning-cppe5/runs/zv86s9w6)
 # sensetime-deformable-detr-finetuned-10k-cppe5-manual-pad
 
-This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on
+This model is a fine-tuned version of [SenseTime/deformable-detr](https://huggingface.co/SenseTime/deformable-detr) on an unknown dataset.
 It achieves the following results on the evaluation set:
 - Loss: 1.0442
 - Map: 0.3514
@@ -61,7 +59,7 @@ More information needed
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
 - train_batch_size: 4
-- eval_batch_size:
+- eval_batch_size: 8
 - seed: 1337
 - gradient_accumulation_steps: 2
 - total_train_batch_size: 8
```
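The `total_train_batch_size: 8` in the hyperparameter list follows from the other values: it is the per-device batch size multiplied by the gradient accumulation steps (and the device count, which was 1 here). A minimal sketch of that relationship — the helper name is ours, not part of the Trainer API:

```python
# Sketch (not from this commit): how the effective optimizer-step batch
# size reported as total_train_batch_size is derived from the other
# hyperparameters. The values 4 and 2 come from the README diff above.

def effective_batch_size(per_device_batch_size: int,
                         gradient_accumulation_steps: int,
                         num_devices: int = 1) -> int:
    """Samples contributing to one optimizer step:
    per-device batch x accumulation steps x devices."""
    return per_device_batch_size * gradient_accumulation_steps * num_devices

print(effective_batch_size(4, 2))  # 8, matching total_train_batch_size
```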
training_args.bin CHANGED

```diff
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:98adaa035dc80492e4fa66bfeb25d486f379832ec881e30bfdac8f750bc376d8
 size 4923
```
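`training_args.bin` is stored as a Git LFS pointer file: the `oid sha256:` line is the SHA-256 digest of the actual binary, and `size` is its byte length. A minimal sketch (the file path in the comment is hypothetical) for checking a downloaded file against such a pointer:

```python
import hashlib

def lfs_oid(path: str, chunk_size: int = 1 << 20) -> str:
    """SHA-256 of a file, formatted like the oid field of a Git LFS pointer."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large LFS objects don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return "sha256:" + h.hexdigest()

# Hypothetical usage: compare against the pointer's oid line, e.g.
# lfs_oid("training_args.bin") should equal
# "sha256:98adaa035dc80492e4fa66bfeb25d486f379832ec881e30bfdac8f750bc376d8"
```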