ColleenMacklin committed
Commit 92b4430 • Parent(s): 3cc307f
gpt-neo-125m-finetuned-philosopher_rave_100

README.md
ADDED
@@ -0,0 +1,156 @@
---
license: mit
base_model: EleutherAI/gpt-neo-125m
tags:
- generated_from_trainer
model-index:
- name: gpt-neo-125m-finetuned-philosopher_rave_100
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# gpt-neo-125m-finetuned-philosopher_rave_100

This model is a fine-tuned version of [EleutherAI/gpt-neo-125m](https://huggingface.co/EleutherAI/gpt-neo-125m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3681
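
The checkpoint can be loaded with the standard `transformers` text-generation API. A minimal usage sketch, assuming the model is published under the committer's namespace as `ColleenMacklin/gpt-neo-125m-finetuned-philosopher_rave_100` (substitute the actual repo id or a local checkpoint path otherwise):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; replace with the actual Hub id or a local path.
model_id = "ColleenMacklin/gpt-neo-125m-finetuned-philosopher_rave_100"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("What is the good life?", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-Neo has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```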

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-07
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100.0
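
These settings correspond to a plain `Trainer` run. A sketch of the implied `TrainingArguments` (Transformers 4.39; dataset preparation, model loading, and the `Trainer` call itself are omitted, and the listed Adam betas and epsilon are the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt-neo-125m-finetuned-philosopher_rave_100",  # assumed output dir
    learning_rate=3e-07,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    evaluation_strategy="epoch",  # assumption: the results table shows one eval per epoch
)
```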

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log | 1.0 | 155 | 2.6967 |
| No log | 2.0 | 310 | 2.6846 |
| No log | 3.0 | 465 | 2.6733 |
| 2.6891 | 4.0 | 620 | 2.6626 |
| 2.6891 | 5.0 | 775 | 2.6524 |
| 2.6891 | 6.0 | 930 | 2.6427 |
| 2.6569 | 7.0 | 1085 | 2.6336 |
| 2.6569 | 8.0 | 1240 | 2.6248 |
| 2.6569 | 9.0 | 1395 | 2.6164 |
| 2.6215 | 10.0 | 1550 | 2.6083 |
| 2.6215 | 11.0 | 1705 | 2.6005 |
| 2.6215 | 12.0 | 1860 | 2.5931 |
| 2.6022 | 13.0 | 2015 | 2.5858 |
| 2.6022 | 14.0 | 2170 | 2.5789 |
| 2.6022 | 15.0 | 2325 | 2.5721 |
| 2.6022 | 16.0 | 2480 | 2.5657 |
| 2.5777 | 17.0 | 2635 | 2.5594 |
| 2.5777 | 18.0 | 2790 | 2.5532 |
| 2.5777 | 19.0 | 2945 | 2.5473 |
| 2.5548 | 20.0 | 3100 | 2.5416 |
| 2.5548 | 21.0 | 3255 | 2.5360 |
| 2.5548 | 22.0 | 3410 | 2.5306 |
| 2.5359 | 23.0 | 3565 | 2.5253 |
| 2.5359 | 24.0 | 3720 | 2.5202 |
| 2.5359 | 25.0 | 3875 | 2.5152 |
| 2.5248 | 26.0 | 4030 | 2.5103 |
| 2.5248 | 27.0 | 4185 | 2.5056 |
| 2.5248 | 28.0 | 4340 | 2.5011 |
| 2.5248 | 29.0 | 4495 | 2.4966 |
| 2.5053 | 30.0 | 4650 | 2.4922 |
| 2.5053 | 31.0 | 4805 | 2.4880 |
| 2.5053 | 32.0 | 4960 | 2.4839 |
| 2.4871 | 33.0 | 5115 | 2.4798 |
| 2.4871 | 34.0 | 5270 | 2.4759 |
| 2.4871 | 35.0 | 5425 | 2.4721 |
| 2.4808 | 36.0 | 5580 | 2.4683 |
| 2.4808 | 37.0 | 5735 | 2.4647 |
| 2.4808 | 38.0 | 5890 | 2.4612 |
| 2.4659 | 39.0 | 6045 | 2.4577 |
| 2.4659 | 40.0 | 6200 | 2.4544 |
| 2.4659 | 41.0 | 6355 | 2.4511 |
| 2.4517 | 42.0 | 6510 | 2.4479 |
| 2.4517 | 43.0 | 6665 | 2.4447 |
| 2.4517 | 44.0 | 6820 | 2.4417 |
| 2.4517 | 45.0 | 6975 | 2.4387 |
| 2.4466 | 46.0 | 7130 | 2.4359 |
| 2.4466 | 47.0 | 7285 | 2.4330 |
| 2.4466 | 48.0 | 7440 | 2.4303 |
| 2.4348 | 49.0 | 7595 | 2.4276 |
| 2.4348 | 50.0 | 7750 | 2.4250 |
| 2.4348 | 51.0 | 7905 | 2.4225 |
| 2.4238 | 52.0 | 8060 | 2.4201 |
| 2.4238 | 53.0 | 8215 | 2.4177 |
| 2.4238 | 54.0 | 8370 | 2.4154 |
| 2.4172 | 55.0 | 8525 | 2.4131 |
| 2.4172 | 56.0 | 8680 | 2.4109 |
| 2.4172 | 57.0 | 8835 | 2.4088 |
| 2.4172 | 58.0 | 8990 | 2.4067 |
| 2.4097 | 59.0 | 9145 | 2.4047 |
| 2.4097 | 60.0 | 9300 | 2.4027 |
| 2.4097 | 61.0 | 9455 | 2.4008 |
| 2.4054 | 62.0 | 9610 | 2.3990 |
| 2.4054 | 63.0 | 9765 | 2.3972 |
| 2.4054 | 64.0 | 9920 | 2.3955 |
| 2.3936 | 65.0 | 10075 | 2.3938 |
| 2.3936 | 66.0 | 10230 | 2.3922 |
| 2.3936 | 67.0 | 10385 | 2.3906 |
| 2.394 | 68.0 | 10540 | 2.3891 |
| 2.394 | 69.0 | 10695 | 2.3877 |
| 2.394 | 70.0 | 10850 | 2.3863 |
| 2.387 | 71.0 | 11005 | 2.3850 |
| 2.387 | 72.0 | 11160 | 2.3837 |
| 2.387 | 73.0 | 11315 | 2.3824 |
| 2.387 | 74.0 | 11470 | 2.3813 |
| 2.3812 | 75.0 | 11625 | 2.3801 |
| 2.3812 | 76.0 | 11780 | 2.3791 |
| 2.3812 | 77.0 | 11935 | 2.3780 |
| 2.3812 | 78.0 | 12090 | 2.3771 |
| 2.3812 | 79.0 | 12245 | 2.3762 |
| 2.3812 | 80.0 | 12400 | 2.3753 |
| 2.3802 | 81.0 | 12555 | 2.3745 |
| 2.3802 | 82.0 | 12710 | 2.3737 |
| 2.3802 | 83.0 | 12865 | 2.3730 |
| 2.3687 | 84.0 | 13020 | 2.3723 |
| 2.3687 | 85.0 | 13175 | 2.3717 |
| 2.3687 | 86.0 | 13330 | 2.3711 |
| 2.3687 | 87.0 | 13485 | 2.3706 |
| 2.3722 | 88.0 | 13640 | 2.3702 |
| 2.3722 | 89.0 | 13795 | 2.3698 |
| 2.3722 | 90.0 | 13950 | 2.3694 |
| 2.3693 | 91.0 | 14105 | 2.3691 |
| 2.3693 | 92.0 | 14260 | 2.3688 |
| 2.3693 | 93.0 | 14415 | 2.3686 |
| 2.3654 | 94.0 | 14570 | 2.3684 |
| 2.3654 | 95.0 | 14725 | 2.3683 |
| 2.3654 | 96.0 | 14880 | 2.3682 |
| 2.372 | 97.0 | 15035 | 2.3682 |
| 2.372 | 98.0 | 15190 | 2.3681 |
| 2.372 | 99.0 | 15345 | 2.3681 |
| 2.3664 | 100.0 | 15500 | 2.3681 |


### Framework versions

- Transformers 4.39.3
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2

generation_config.json
ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.39.3"
}
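
Both token ids are 50256, the GPT-2 BPE `<|endoftext|>` token that GPT-Neo inherits from its tokenizer. A quick way to inspect these defaults, assuming the same repo id as in the usage sketch above:

```python
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained(
    "ColleenMacklin/gpt-neo-125m-finetuned-philosopher_rave_100"  # assumed repo id
)
print(gen_config.bos_token_id, gen_config.eos_token_id)  # 50256 50256
```
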
runs/Apr04_15-55-42_590c786a3313/events.out.tfevents.1712246171.590c786a3313.4137.4
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:4c12036b4b84c0acdb89a458aea50886aeb9ab0d8aa7ca52f1ce87f7a7fd43e9
+size 39135
runs/Apr04_15-55-42_590c786a3313/events.out.tfevents.1712248561.590c786a3313.4137.5
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a97efd511432913f962f15c7d1f2fefc89962918ec2108f183d18ed354a9336f
size 359