sometimesanotion committed on
Commit db30dbe
Parent: a7a2697

Update README.md

Files changed (1):
  1. README.md +1 -2
README.md CHANGED
@@ -44,8 +44,7 @@ The first two layers come entirely from Virtuoso. The choice to leave these lay
 
 - @arcee-ai's team for the ever-capable mergekit, and the exceptional Virtuoso Small model.
 - @CultriX for the helpful examples of memory-efficient sliced merges and evolutionary merging. Their contribution of tinyevals on version 0.1 of Lamarck did much to validate the hypotheses of the DELLA->SLERP gradient process used here.
-- The authors behind the capable models that appear in the model_stock. The boost to prose quality is already noticeable.
-
+- The authors behind the capable models that appear in the model_stock.
 ### Models Merged
 
 **Top influences:** These ancestors are base models and present in the model_stocks, but are heavily re-emphasized in the DELLA and SLERP merges.
 