Update README.md
README.md (changed)
@@ -14,6 +14,8 @@ library_name: transformers
 
 📑 [arXiv]() | 🤗 [HuggingFace](https://huggingface.co/GeneZC/MiniMA-3B) | 🤖 [ModelScope]()
 
+:warning: Must comply with the LICENSE of LLaMA2, since it is a model derived from LLaMA2.
+
 A language model distilled from an adapted version of LLaMA2-7B, following "Towards the Law of Capacity Gap in Distilling Language Models".
 
 Establishing a new compute-performance Pareto frontier.
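
For context, a minimal sketch of loading the model described in this README with the transformers library (the repo id is taken from the HuggingFace link above; the prompt and generation settings are illustrative assumptions, not part of the model card):

```python
# Minimal usage sketch for MiniMA-3B via Hugging Face transformers.
# Repo id comes from the README link; prompt and generation settings
# below are assumptions for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GeneZC/MiniMA-3B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

prompt = "Knowledge distillation is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```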