Update README.md
README.md
CHANGED
@@ -127,8 +127,8 @@ The final evaluation cross-entropy ended around 0.4 for this model.
-| | Loss on Llama 3.1 fine tuning | Notice
-
+| | Loss on Llama 3.1 fine tuning | Notice |
+|:------------------|:---------------------------|:-----------|
 | **LORA** | 0.4603 | |
 | **LORA+** | 0.4011 | The model uploaded here |
 | **DORA**| 0.4182 | |