Fawazzx/SaulLm_Finetuned_10k
README.md CHANGED
@@ -43,7 +43,7 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 5
-- training_steps:
+- training_steps: 100
 - mixed_precision_training: Native AMP

 ### Training results
@@ -55,5 +55,5 @@ The following hyperparameters were used during training:
 - PEFT 0.10.0
 - Transformers 4.40.1
 - Pytorch 2.1.0+cu121
-- Datasets 2.19.
+- Datasets 2.19.1
 - Tokenizers 0.19.1
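For orientation, the hyperparameters shown in this hunk correspond roughly to a `transformers.TrainingArguments` configuration like the sketch below. The output directory name is an assumption, and values not visible in the diff (learning rate, batch size) are omitted.

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the settings visible in the README diff.
# output_dir is assumed; learning rate and batch size are not shown in the hunk.
training_args = TrainingArguments(
    output_dir="saullm-finetuned-10k",  # assumed name, not taken from the repo
    lr_scheduler_type="linear",         # lr_scheduler_type: linear
    warmup_steps=5,                     # lr_scheduler_warmup_steps: 5
    max_steps=100,                      # training_steps: 100
    fp16=True,                          # mixed_precision_training: Native AMP
    adam_beta1=0.9,                     # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,                 #   and epsilon=1e-08
)
```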
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:e44ce263e6fd885f50d82ca515b9325375b43ee36ededb75acf161ce88bc2e41
+size 48
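The updated adapter_model.safetensors pointer replaces the PEFT adapter weights. A minimal loading sketch, assuming the base checkpoint is Equall/Saul-Instruct-v1 (the commit itself does not name the base model):

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "Equall/Saul-Instruct-v1"          # assumption; not stated in this commit
adapter_id = "Fawazzx/SaulLm_Finetuned_10k"  # this repository

# Load the base model, then attach the adapter stored in adapter_model.safetensors.
base_model = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)
```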
runs/May07_07-11-22_eba1f7ea0bcc/events.out.tfevents.1715065889.eba1f7ea0bcc.1558.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:73d070853737bd1805fe42354fb4aa49021f3a13bea04856989a7d931b129840
+size 26175
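This added file is the TensorBoard event log that the Trainer's logging integration writes under runs/; after cloning the repository it can be viewed locally with `tensorboard --logdir runs`.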
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:940c7a95acf1286a19cf301064141177f181eb229ac35f060c23d70db18f23c6
 size 4984
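training_args.bin is the pickled TrainingArguments object that the Trainer saves alongside a checkpoint. A quick way to inspect it, assuming the file has been downloaded from the repository:

```python
import torch

# The file is a pickled Python object, not a tensor file, so weights_only=False.
args = torch.load("training_args.bin", weights_only=False)
print(args.max_steps, args.warmup_steps, args.lr_scheduler_type)
```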