Commit c05f8b8 by nenad1002
Parent: 603b4c9

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -125,7 +125,7 @@ Please see the graph below:
 
 The final evaluation cross-entropy ended around 0.4 for this model.
 
-The table below shows the cross-entropies for each technique applied when the embedding training was present. Without the embedding, the results were usually worse for up to 0.1.
+The table below shows the best cross-entropy (across all params) for each of the techniques applied. Without the embedding nodes included, the results were usually worse by up to 0.1.
 
 
 | | Loss on Llama 3.1 fine tuning | Notice |
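For context, the "evaluation cross-entropy" quoted in the diff is the standard token-level cross-entropy of a causal LM averaged over an eval split. Below is a minimal sketch of how such a number could be computed with Hugging Face transformers; the model name and eval texts are placeholders, not taken from this repository.

```python
# Minimal sketch: mean token-level cross-entropy of a causal LM on eval text.
# Model name and eval data are placeholders, not from this repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-3.1-8B"  # hypothetical; substitute the fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

eval_texts = ["example evaluation sentence"]  # placeholder eval split

losses = []
with torch.no_grad():
    for text in eval_texts:
        enc = tokenizer(text, return_tensors="pt")
        # With labels=input_ids, the model returns the mean cross-entropy
        # over predicted tokens (labels are shifted internally for causal LM).
        out = model(**enc, labels=enc["input_ids"])
        losses.append(out.loss.item())

# Simple per-text average; a token-weighted average would be more precise.
print(f"eval cross-entropy: {sum(losses) / len(losses):.3f}")
```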