Anish committed
Commit 6d0cfa7 · verified · 1 Parent(s): 702e4a9

End of training

Files changed (3)
  1. README.md +118 -0
  2. generation_config.json +5 -0
  3. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,118 @@
+ ---
+ library_name: transformers
+ base_model: google/muril-large-cased
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: muril-large-cased-tweet-devnagri-grouped
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # muril-large-cased-tweet-devnagri-grouped
+
+ This model is a fine-tuned version of [google/muril-large-cased](https://huggingface.co/google/muril-large-cased) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 1.4110
+
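+ The card does not state the training objective; given the base model, this is presumably a masked-language-modelling (MLM) fine-tune of MuRIL on Devanagari tweets. If the loss above is the usual cross-entropy MLM loss, it corresponds to a perplexity of about exp(1.4110) ≈ 4.10. A minimal usage sketch under that assumption (the model id below is a placeholder for wherever this checkpoint is hosted):
+
+ ```python
+ from transformers import pipeline
+
+ # Assumes the checkpoint keeps MuRIL's masked-LM head; swap in the real repo id.
+ fill_mask = pipeline("fill-mask", model="muril-large-cased-tweet-devnagri-grouped")
+
+ # MuRIL's tokenizer uses the BERT-style [MASK] token.
+ for candidate in fill_mask("मुझे यह [MASK] बहुत पसंद है।"):
+     print(candidate["token_str"], candidate["score"])
+ ```
+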
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 64
+ - eval_batch_size: 64
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 3
+
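+ For reference, a sketch of how the listed hyperparameters map onto `TrainingArguments`; the output directory and anything not listed above are assumptions, not the author's actual script:
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="muril-large-cased-tweet-devnagri-grouped",  # assumed
+     learning_rate=2e-5,
+     per_device_train_batch_size=64,
+     per_device_eval_batch_size=64,
+     seed=42,
+     lr_scheduler_type="linear",
+     num_train_epochs=3,
+     # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
+     # so the optimizer needs no explicit arguments.
+ )
+ ```
+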
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:------:|:---------------:|
+ | No log | 0.0478 | 5000 | 2.5496 |
+ | No log | 0.0955 | 10000 | 2.1840 |
+ | No log | 0.1433 | 15000 | 2.0172 |
+ | No log | 0.1910 | 20000 | 1.9188 |
+ | No log | 0.2388 | 25000 | 1.8525 |
+ | No log | 0.2865 | 30000 | 1.8047 |
+ | No log | 0.3343 | 35000 | 1.7694 |
+ | No log | 0.3820 | 40000 | 1.7406 |
+ | No log | 0.4298 | 45000 | 1.7076 |
+ | No log | 0.4775 | 50000 | 1.6848 |
+ | No log | 0.5253 | 55000 | 1.6713 |
+ | No log | 0.5730 | 60000 | 1.6543 |
+ | No log | 0.6208 | 65000 | 1.6364 |
+ | No log | 0.6685 | 70000 | 1.6226 |
+ | No log | 0.7163 | 75000 | 1.6103 |
+ | No log | 0.7640 | 80000 | 1.5976 |
+ | No log | 0.8118 | 85000 | 1.5925 |
+ | No log | 0.8595 | 90000 | 1.5883 |
+ | No log | 0.9073 | 95000 | 1.5763 |
+ | No log | 0.9550 | 100000 | 1.5581 |
+ | 1.9195 | 1.0028 | 105000 | 1.5774 |
+ | 1.9195 | 1.0505 | 110000 | 1.5507 |
+ | 1.9195 | 1.0983 | 115000 | 1.5728 |
+ | 1.9195 | 1.1460 | 120000 | 1.5328 |
+ | 1.9195 | 1.1938 | 125000 | 1.5265 |
+ | 1.9195 | 1.2415 | 130000 | 1.5199 |
+ | 1.9195 | 1.2893 | 135000 | 1.5216 |
+ | 1.9195 | 1.3370 | 140000 | 1.5098 |
+ | 1.9195 | 1.3848 | 145000 | 1.5061 |
+ | 1.9195 | 1.4325 | 150000 | 1.4985 |
+ | 1.9195 | 1.4803 | 155000 | 1.4943 |
+ | 1.9195 | 1.5280 | 160000 | 1.4933 |
+ | 1.9195 | 1.5758 | 165000 | 1.4853 |
+ | 1.9195 | 1.6235 | 170000 | 1.4778 |
+ | 1.9195 | 1.6713 | 175000 | 1.4797 |
+ | 1.9195 | 1.7190 | 180000 | 1.4702 |
+ | 1.9195 | 1.7668 | 185000 | 1.4958 |
+ | 1.9195 | 1.8145 | 190000 | 1.4683 |
+ | 1.9195 | 1.8623 | 195000 | 1.4748 |
+ | 1.9195 | 1.9100 | 200000 | 1.4560 |
+ | 1.9195 | 1.9578 | 205000 | 1.4553 |
+ | 1.5744 | 2.0055 | 210000 | 1.4431 |
+ | 1.5744 | 2.0533 | 215000 | 1.4432 |
+ | 1.5744 | 2.1010 | 220000 | 1.4446 |
+ | 1.5744 | 2.1488 | 225000 | 1.4407 |
+ | 1.5744 | 2.1965 | 230000 | 1.4454 |
+ | 1.5744 | 2.2443 | 235000 | 1.4371 |
+ | 1.5744 | 2.2920 | 240000 | 1.4351 |
+ | 1.5744 | 2.3398 | 245000 | 1.4291 |
+ | 1.5744 | 2.3875 | 250000 | 1.4293 |
+ | 1.5744 | 2.4353 | 255000 | 1.4245 |
+ | 1.5744 | 2.4830 | 260000 | 1.4253 |
+ | 1.5744 | 2.5308 | 265000 | 1.4305 |
+ | 1.5744 | 2.5785 | 270000 | 1.4221 |
+ | 1.5744 | 2.6263 | 275000 | 1.4181 |
+ | 1.5744 | 2.6740 | 280000 | 1.4146 |
+ | 1.5744 | 2.7218 | 285000 | 1.4149 |
+ | 1.5744 | 2.7695 | 290000 | 1.4131 |
+ | 1.5744 | 2.8173 | 295000 | 1.4155 |
+ | 1.5744 | 2.8650 | 300000 | 1.4137 |
+ | 1.5744 | 2.9128 | 305000 | 1.4119 |
+ | 1.5744 | 2.9605 | 310000 | 1.4070 |
+
+ ### Framework versions
+
+ - Transformers 4.45.0
+ - Pytorch 2.4.1+cu121
+ - Datasets 3.0.1
+ - Tokenizers 0.20.0
generation_config.json ADDED
@@ -0,0 +1,5 @@
+ {
+   "_from_model_config": true,
+   "pad_token_id": 0,
+   "transformers_version": "4.45.0"
+ }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:703556beeb42badb585714aebb8f90109b2be927dd947831fc030d16d89fb4f2
+ oid sha256:fc675e684f6fc26f10e3f601be06cd9631609673d79c8d92352499c4833a2e7b
  size 2024473756