nenad1002 committed on
Commit
e84ae8b
1 Parent(s): c05f8b8

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -125,7 +125,7 @@ Please see the graph below:
 
 The final evaluation cross-entropy ended around 0.4 for this model.
 
-The table below shows the best cross-entropy (across all params) for each of the techniques applied. Without the embedding nodes included, the results were usually worse for up to 0.1.
+The table below shows the best evaluation cross-entropy (across all params) for each of the techniques applied. Without the embedding nodes included, the results were usually worse for up to 0.1.
 
 
 | | Loss on Llama 3.1 fine tuning | Notice |
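For context on the metric this commit's wording clarifies: evaluation cross-entropy is the average negative log-likelihood the model assigns to the correct next token over the evaluation set. A minimal sketch of that computation (the logits and targets below are toy values, not from this repository):

```python
import math

def cross_entropy(logits, target):
    # log-softmax via the log-sum-exp trick, then negative log-likelihood
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_z - logits[target]

# toy eval batch: (per-token logits, true token id) pairs
batch = [([2.0, 0.5, 0.1], 0), ([0.2, 1.5, 0.3], 1)]

# evaluation cross-entropy = mean per-token loss
avg = sum(cross_entropy(logits, t) for logits, t in batch) / len(batch)
```

A value around 0.4, as the README reports, means the model assigns the correct token a probability of roughly e^-0.4 ≈ 0.67 on average.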