Update README.md
README.md
Don't worry even if you don't get the results you want.
I'll find the answer for you.

Soon, real PoSE will extend Llama's context length to 64k, combined with my merge method: "[reborn](https://medium.com/@puffanddmx82/reborn-elevating-model-adaptation-with-merging-for-superior-nlp-performance-f604e8e307b2)".
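For context, a minimal sketch of the PoSE (Positional Skip-wisE) idea this plan relies on: during training on a short window, the position ids are split into chunks and shifted by a random skip so they span the longer target window. The function name, the two-chunk split, and all parameters below are illustrative assumptions, not code from this repo.

```python
import random

def pose_position_ids(train_len: int, target_len: int, seed=None) -> list[int]:
    """Sample PoSE-style position ids: a short training window whose
    positions are spread across a longer target context window.

    Illustrative two-chunk variant; train_len < target_len is assumed.
    """
    rng = random.Random(seed)
    # Split the training window into two chunks at a random point.
    split = rng.randint(1, train_len - 1)
    # Sample a skip so the largest position still fits inside target_len.
    skip = rng.randint(0, target_len - train_len)
    # First chunk keeps its positions; second chunk is shifted by the skip,
    # so the model sees relative distances from the full target window.
    first = list(range(split))
    second = [split + skip + i for i in range(train_len - split)]
    return first + second

# Example: an 8-token training window spread over a 32-position target window.
ids = pose_position_ids(8, 32, seed=0)
```

The returned ids are strictly increasing and bounded by the target window, which is what lets a model trained on short sequences generalize to longer ones.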
I have found that most merges so far do not actually advertise 64k in their configs. I will fix this in the next merge.
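The config issue above amounts to the advertised context window being smaller than what the model was trained for. A hedged sketch of the check, assuming the Llama-style `max_position_embeddings` key from a Hugging Face `config.json` (the 64k target value and helper name are illustrative):

```python
def ensure_context_length(config: dict, target: int = 65536) -> dict:
    """Return a copy of a model config whose advertised context window
    (max_position_embeddings, the Llama config convention) is at least
    `target`. Inference stacks typically truncate inputs to this value,
    so a 64k-trained merge that still reports 4096 behaves like a 4k model.
    """
    fixed = dict(config)
    if fixed.get("max_position_embeddings", 0) < target:
        fixed["max_position_embeddings"] = target
    return fixed

# A merged model that still reports the base model's 4k window:
merged = {"model_type": "llama", "max_position_embeddings": 4096}
print(ensure_context_length(merged)["max_position_embeddings"])  # 65536
```

Note this only fixes what the config *claims*; the weights themselves must already support the longer window (e.g. via PoSE training) for the change to be meaningful.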
256k is not possible yet: my computer runs out of memory. If you support me, I will try it on a machine with maximum specifications.