Update README.md
README.md
@@ -2,7 +2,10 @@
 license: apache-2.0
 ---
 ##### Anonymouspro/Ttinyllama4b
-
+Contact (WhatsApp): +8801622951671
+This model supports older Android phones without a high-end CPU.
+No GPU is needed.
+A normal phone can run the 4B TinyLlama model.
 
 The TinyLlama project aims to pretrain a 4B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs. The training started on 2023-09-01.
 