jyhong836 committed
Commit 61199cd · Parent(s): 61536cb

Update README.md

Files changed (1): README.md (+28, -0)
README.md CHANGED
@@ -10,3 +10,31 @@ pinned: false
# Visual Informatics Group @ University of Texas at Austin

At VITA group, we have unusually broad and forever-evolving research interests, spanning from the theory to the application aspects of machine learning (ML). Our current "research keywords" include, but are not limited to: sparsity (from classical optimization to modern neural networks); efficient training, inference, or transfer (especially of large foundation models); robustness and trustworthiness; learning to optimize (L2O); generative AI; graph learning; and more.
+
+
+ ## Compressed LLM Model Zone
+
+ The models are prepared by [Visual Informatics Group @ University of Texas at Austin (VITA-group)](https://vita-group.github.io/).
+
+ License: [MIT License](https://opensource.org/license/mit/)
+
+
+ |    | Base Model | Model Size | Compression Method     | Compression Degree                                                                     |
+ |---:|:-----------|:-----------|:-----------------------|:---------------------------------------------------------------------------------------|
+ |  0 | Llama-2    | 7b         | magnitude_unstructured | [s0.1](https://huggingface.co/vita-group/comp-llama-2-7b_magnitude_unstructured_s0.1)  |
+ |  1 | Llama-2    | 7b         | magnitude_unstructured | [s0.2](https://huggingface.co/vita-group/comp-llama-2-7b_magnitude_unstructured_s0.2)  |
+ |  2 | Llama-2    | 7b         | magnitude_unstructured | [s0.3](https://huggingface.co/vita-group/comp-llama-2-7b_magnitude_unstructured_s0.3)  |
+ |  3 | Llama-2    | 7b         | magnitude_unstructured | [s0.5](https://huggingface.co/vita-group/comp-llama-2-7b_magnitude_unstructured_s0.5)  |
+ |  4 | Llama-2    | 7b         | magnitude_unstructured | [s0.6](https://huggingface.co/vita-group/comp-llama-2-7b_magnitude_unstructured_s0.6)  |
+ |  5 | Llama-2    | 7b         | sparsegpt_unstructured | [s0.1](https://huggingface.co/vita-group/comp-llama-2-7b_sparsegpt_unstructured_s0.1)  |
+ |  6 | Llama-2    | 7b         | sparsegpt_unstructured | [s0.2](https://huggingface.co/vita-group/comp-llama-2-7b_sparsegpt_unstructured_s0.2)  |
+ |  7 | Llama-2    | 7b         | sparsegpt_unstructured | [s0.3](https://huggingface.co/vita-group/comp-llama-2-7b_sparsegpt_unstructured_s0.3)  |
+ |  8 | Llama-2    | 7b         | sparsegpt_unstructured | [s0.5](https://huggingface.co/vita-group/comp-llama-2-7b_sparsegpt_unstructured_s0.5)  |
+ |  9 | Llama-2    | 7b         | sparsegpt_unstructured | [s0.6](https://huggingface.co/vita-group/comp-llama-2-7b_sparsegpt_unstructured_s0.6)  |
+ | 10 | Llama-2    | 7b         | wanda_unstructured     | [s0.1](https://huggingface.co/vita-group/comp-llama-2-7b_wanda_unstructured_s0.1)      |
+ | 11 | Llama-2    | 7b         | wanda_unstructured     | [s0.2](https://huggingface.co/vita-group/comp-llama-2-7b_wanda_unstructured_s0.2)      |
+ | 12 | Llama-2    | 7b         | wanda_unstructured     | [s0.3](https://huggingface.co/vita-group/comp-llama-2-7b_wanda_unstructured_s0.3)      |
+ | 13 | Llama-2    | 7b         | wanda_unstructured     | [s0.5](https://huggingface.co/vita-group/comp-llama-2-7b_wanda_unstructured_s0.5)      |
+ | 14 | Llama-2    | 7b         | wanda_unstructured     | [s0.6](https://huggingface.co/vita-group/comp-llama-2-7b_wanda_unstructured_s0.6)      |
+
+
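For context, a minimal sketch of how one of the checkpoints listed in the added table might be loaded with Hugging Face `transformers`. This is not part of the commit: it assumes each listed repository is a standard transformers-compatible causal-LM checkpoint; the model ID is copied from the wanda_unstructured / s0.5 row, and the dtype and device settings are illustrative choices only.

```python
# Minimal sketch (assumption: the repository is a standard causal-LM checkpoint).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID taken from the wanda_unstructured / s0.5 row of the table above.
model_id = "vita-group/comp-llama-2-7b_wanda_unstructured_s0.5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps a 7B model within a single GPU
    device_map="auto",          # requires `accelerate`; places layers on available devices
)

prompt = "The University of Texas at Austin is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```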