cataluna84 committed
Commit d63f2a7
1 Parent(s): a2fa6ef
Update README.md

README.md CHANGED
@@ -11,12 +11,27 @@ Multilingual language models are typically large, requiring significant computat

Can we create multilingual models that maintain performance comparable to their larger counterparts while reducing size and latency and improving inference speed?

Techniques:
- Pruning (see the layer-sensitivity sketch after this list)
  - SparseGPT | [GitHub](https://github.com/VishnuVardhanSaiLanka/sparsegpt/tree/aya)
  - ShortGPT | [Perplexity Sensitivities](https://github.com/rsk2327/DistAya/tree/main)
- Knowledge Distillation (see the distillation-loss sketch after this list)
  - DistillKit | [GitHub](https://github.com/ShayekhBinIslam/DistillKit)
  - Distil-Whisper-based method
  - On-policy distillation of language models
  - Minitron: Compact Language Models via Pruning & Knowledge Distillation
  - DistiLLM: Towards Streamlined Distillation for Large Language Models
- Quantization (see the int8 sketch after this list)
- Fine-Tuning | [GitHub](https://github.com/rsk2327/DistAya/tree/track/fine-tuning)
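
The ShortGPT-style pruning linked above ranks decoder layers by how sensitive perplexity is to their removal and drops the least sensitive ones. Below is a minimal sketch of that idea, assuming a LLaMA-style Hugging Face model whose decoder blocks live in `model.model.layers`; the model name and calibration texts are placeholders, not what the linked repo uses.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholders: swap in the actual model and a few hundred multilingual calibration texts.
MODEL_NAME = "CohereForAI/aya-23-8B"
CALIBRATION_TEXTS = ["Example sentence one.", "Example sentence two."]

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, torch_dtype=torch.float16, device_map="auto")
model.eval()

@torch.no_grad()
def perplexity(texts):
    """Average perplexity of the current model over the calibration set."""
    losses = []
    for text in texts:
        batch = tokenizer(text, return_tensors="pt").to(model.device)
        out = model(**batch, labels=batch["input_ids"], use_cache=False)
        losses.append(out.loss.float())
    return torch.exp(torch.stack(losses).mean()).item()

layers = model.model.layers            # assumption: LLaMA-style list of decoder blocks
baseline = perplexity(CALIBRATION_TEXTS)

sensitivities = []
for i in range(len(layers)):
    # Temporarily drop layer i and measure how much perplexity degrades.
    model.model.layers = torch.nn.ModuleList(l for j, l in enumerate(layers) if j != i)
    sensitivities.append((i, perplexity(CALIBRATION_TEXTS) - baseline))
    model.model.layers = layers        # restore the full stack

# Layers whose removal barely moves perplexity are the first candidates to prune.
for i, delta in sorted(sensitivities, key=lambda s: s[1]):
    print(f"layer {i:2d}: +{delta:.3f} perplexity when removed")
```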
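
All of the distillation recipes listed above share one ingredient: a loss that pulls the student's token distribution toward the teacher's while still fitting the ground-truth labels. A minimal sketch of that combined loss, assuming teacher and student share a tokenizer (names and default weights are illustrative):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    """Soft-target KL against the teacher blended with hard-target cross-entropy.

    student_logits, teacher_logits: (batch, seq_len, vocab_size), same tokenizer.
    labels: (batch, seq_len) token ids with -100 at positions to ignore.
    """
    # Soft targets: KL divergence between temperature-softened distributions.
    # (A full implementation would also mask padded positions out of this term.)
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2   # rescale so gradients stay comparable across temperatures

    # Hard targets: ordinary next-token cross-entropy on the ground-truth labels.
    ce = F.cross_entropy(
        student_logits.view(-1, student_logits.size(-1)),
        labels.view(-1),
        ignore_index=-100,
    )
    return alpha * kl + (1.0 - alpha) * ce
```

In training, the teacher's forward pass runs under `torch.no_grad()` and only the student's parameters are updated; on-policy variants additionally generate the training sequences from the student itself.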
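
For the Quantization bullet, the core idea is to store weights in a low-bit integer format plus a scale. Below is a from-scratch sketch of symmetric per-channel int8 weight quantization; actual experiments would rely on a library such as bitsandbytes or GPTQ/AWQ rather than this toy code.

```python
import torch

def quantize_int8(weight: torch.Tensor):
    """Symmetric per-output-channel int8 quantization of a (out_features, in_features) matrix."""
    # One scale per output channel keeps the rounding error far below a single global scale.
    scale = weight.abs().amax(dim=1, keepdim=True) / 127.0
    q = torch.clamp(torch.round(weight / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize_int8(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    return q.to(torch.float32) * scale

w = torch.randn(4096, 4096)                     # stand-in for one linear layer's weights
q, scale = quantize_int8(w)
err = (w - dequantize_int8(q, scale)).abs().mean()
print(f"fp32: {w.numel() * 4 / 2**20:.0f} MiB  int8: {q.numel() / 2**20:.0f} MiB  mean |error|: {err:.5f}")
```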

Dataset:
An initial 7 datasets have been unified into a single collection of 6.62M rows, which includes the following (see the unification sketch after this list):
- Bangla_Alpaca_Orca: Bangla
- Urdu_Instruct_News_Article_Generation: Urdu
- Urdu_Instruct_News_Headline_Generation: Urdu
- Urdu_Instruct_News_Category_Classification: Urdu
- cidar: Arabic
- Six_Millions_Instruction_Dataset_For_Arabic_Llm_Ft: Arabic
- instructv3: English
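
A rough sketch of how such a unification can be scripted with the Hugging Face `datasets` library; the hub ids below are placeholders and the column handling is simplified, since each source first has to be mapped onto a shared schema before concatenation.

```python
from datasets import concatenate_datasets, load_dataset

# Placeholder hub ids: point these at the actual repositories of the 7 source corpora.
SOURCES = {
    "Bangla_Alpaca_Orca": ("<org>/Bangla_Alpaca_Orca", "Bangla"),
    "Urdu_Instruct_News_Article_Generation": ("<org>/Urdu_Instruct_News_Article_Generation", "Urdu"),
    "Urdu_Instruct_News_Headline_Generation": ("<org>/Urdu_Instruct_News_Headline_Generation", "Urdu"),
    "Urdu_Instruct_News_Category_Classification": ("<org>/Urdu_Instruct_News_Category_Classification", "Urdu"),
    "cidar": ("<org>/cidar", "Arabic"),
    "Six_Millions_Instruction_Dataset_For_Arabic_Llm_Ft": ("<org>/Six_Millions_Instruction_Dataset_For_Arabic_Llm_Ft", "Arabic"),
    "instructv3": ("<org>/instructv3", "English"),
}

parts = []
for name, (repo_id, language) in SOURCES.items():
    ds = load_dataset(repo_id, split="train")
    # Tag provenance so per-language subsets can be filtered back out of the unified set.
    ds = ds.map(lambda ex: {"source": name, "language": language})
    parts.append(ds)

# concatenate_datasets requires matching columns, so in practice each source is first
# renamed/mapped onto a common instruction/response schema before this step.
unified = concatenate_datasets(parts)
print(unified)   # ~6.62M rows once all seven sources are included
```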