aashish1904 committed
Commit
c763418
1 Parent(s): 6270857

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +56 -0
README.md ADDED
@@ -0,0 +1,56 @@
---
base_model: unsloth/gemma-2-2b-it-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- dpo
---

[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)

# QuantFactory/gemma-2-2b-it-Flight-Multi-Turn-V3-DPO-GGUF
This is a quantized version of [SameedHussain/gemma-2-2b-it-Flight-Multi-Turn-V3-DPO](https://huggingface.co/SameedHussain/gemma-2-2b-it-Flight-Multi-Turn-V3-DPO), created using llama.cpp.
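A minimal sketch of pulling one of the GGUF files from this repo and running it with `llama-cpp-python` and `huggingface_hub`; the quant filename below is an assumption, so check the repo's file listing for the actual name.

```python
# Sketch: download a GGUF quant from this repo and run it with llama-cpp-python.
# The filename is assumed -- substitute the actual .gguf file listed in the repo.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="QuantFactory/gemma-2-2b-it-Flight-Multi-Turn-V3-DPO-GGUF",
    filename="gemma-2-2b-it-Flight-Multi-Turn-V3-DPO.Q4_K_M.gguf",  # assumed name
)

llm = Llama(model_path=gguf_path, n_ctx=4096)

# Gemma 2 is an instruction-tuned chat model, so use the chat completion API.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Find me a flight from Lahore to Dubai tomorrow."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```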
# Original Model Card


# Uploaded model

- **Developed by:** SameedHussain
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2-2b-it-bnb-4bit

This gemma2 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
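As a rough illustration of such a setup, below is a minimal Unsloth + TRL `DPOTrainer` sketch, not the author's exact training script: the dataset name (`your-username/flight-multi-turn-preferences`) and all hyperparameters are placeholders, and the exact `DPOTrainer`/`DPOConfig` keyword arguments vary between TRL versions.

```python
# Sketch of DPO fine-tuning with Unsloth and TRL -- NOT the author's exact script.
# Dataset name and every hyperparameter below are placeholders.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import DPOConfig, DPOTrainer

# Load the 4-bit base model this card lists as the starting point.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-2b-it-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights is updated.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# A preference dataset with "prompt", "chosen" and "rejected" columns
# (hypothetical name -- the card does not say which dataset was used).
dataset = load_dataset("your-username/flight-multi-turn-preferences", split="train")

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with LoRA adapters, TRL uses the frozen base as the reference
    args=DPOConfig(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        beta=0.1,           # DPO temperature
        logging_steps=100,  # matches the 100-step logging interval in the table below
    ),
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
```

The table below is the DPO training log reported with the original model.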
| Step | Training Loss | Rewards / Chosen | Rewards / Rejected | Rewards / Accuracies | Rewards / Margins | Logps / Rejected | Logps / Chosen | Logits / Rejected | Logits / Chosen |
|------|---------------|------------------|--------------------|----------------------|-------------------|------------------|----------------|-------------------|-----------------|
| 100 | 0.454700 | 6.241566 | 3.175092 | 0.750000 | 3.066474 | -102.758446 | -53.181263 | -14.580903 | -14.938275 |
| 200 | 0.264100 | 6.640531 | 2.823826 | 0.888750 | 3.816705 | -110.525520 | -50.815018 | -14.796252 | -15.198202 |
| 300 | 0.110200 | 6.310797 | 1.718347 | 0.985000 | 4.592450 | -118.720840 | -48.524315 | -15.263680 | -15.698647 |
| 400 | 0.046900 | 6.744057 | 0.677384 | 0.997500 | 6.066672 | -128.757660 | -48.107479 | -15.710546 | -16.174524 |
| 500 | 0.019700 | 6.714230 | -0.529035 | 1.000000 | 7.243264 | -143.408020 | -49.327625 | -16.120342 | -16.611662 |
| 600 | 0.013700 | 6.605389 | -1.275738 | 1.000000 | 7.881127 | -146.968491 | -48.847641 | -16.320650 | -16.836390 |
| 700 | 0.007900 | 6.333577 | -2.010140 | 1.000000 | 8.343716 | -154.255066 | -50.590134 | -16.486574 | -16.987421 |
| 800 | 0.006300 | 6.489099 | -2.076626 | 1.000000 | 8.565723 | -150.381393 | -49.992256 | -16.614525 | -17.117744 |
| 900 | 0.005100 | 6.429256 | -2.340122 | 1.000000 | 8.769380 | -160.874405 | -51.164425 | -16.687891 | -17.165791 |
| 1000 | 0.004700 | 6.494193 | -2.520164 | 1.000000 | 9.014358 | -163.852982 | -54.317467 | -16.757954 | -17.206339 |
| 1100 | 0.005900 | 6.287598 | -2.524287 | 1.000000 | 8.811884 | -161.473770 | -52.012741 | -16.825716 | -17.266563 |
| 1200 | 0.005200 | 6.246828 | -3.126722 | 0.998750 | 9.373549 | -167.766861 | -52.052780 | -16.795412 | -17.277397 |
| 1300 | 0.004300 | 6.347938 | -2.930621 | 1.000000 | 9.278559 | -165.971939 | -50.738480 | -16.836918 | -17.304783 |
| 1400 | 0.003900 | 6.232501 | -3.073614 | 1.000000 | 9.306114 | -165.787643 | -50.953049 | -16.813383 | -17.290031 |
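As typically logged by TRL's `DPOTrainer`, Rewards / Margins is the average gap between the rewards assigned to the chosen and rejected responses, and Rewards / Accuracies is the fraction of pairs where the chosen reward is higher; DPO training pushes the margin up over time, as the log shows. For example, at step 100:

$$
\text{margin} = \text{rewards/chosen} - \text{rewards/rejected} = 6.241566 - 3.175092 = 3.066474
$$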