(A sheet of questions, answers, and GPT's reviews is also included in this repo.)

Because of its small size, Eluwa can be used for research into conversational models on older and slower hardware.

## Using Eluwa
I used [oobabooga's text generation UI](https://github.com/oobabooga/text-generation-webui) for testing, because it lets me easily regenerate outputs, modify the conversation history passed to the model, and mess with parameters.

To load Eluwa, download [OPT 2.7b from Huggingface](https://huggingface.co/facebook/opt-2.7b), then grab both the .bin and .json files from the /model folder on this Github. Follow the instructions in the text generation UI repository to figure out where the base model goes and how to load a LoRA; Eluwa goes in the /loras folder.
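If you would rather load the adapter programmatically instead of through the web UI, below is a minimal sketch using the standard transformers + peft stack (this is not part of the repo). The adapter path is a placeholder for wherever you saved the .bin and .json files, and the Alpaca-style prompt format is an assumption.

```python
# Minimal sketch: load OPT 2.7b in 8-bit (requires bitsandbytes) and apply the Eluwa LoRA.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-2.7b", load_in_8bit=True, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-2.7b")

# "path/to/eluwa-lora" is a placeholder for the folder holding the adapter .bin and .json
model = PeftModel.from_pretrained(base, "path/to/eluwa-lora")

# Alpaca-style instruction prompt (an assumption, since the adapter was trained on Alpaca data)
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat is a LoRA?\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```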
## Training and notes
Training Eluwa is a straightforward process. It is essentially Facebook's GPT-like OPT 2.7b model, loaded in 8-bit and fine-tuned with a LoRA on [Stanford's Alpaca dataset](https://github.com/tatsu-lab/stanford_alpaca). Use the [Colab notebook here](https://colab.research.google.com/drive/1rkLx0oI8pbix0EznjYeaLDqPoMHdw0x8?usp=sharing); I've written notes in there on what the functions do.
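For reference, here is a condensed sketch of that setup using transformers, peft, datasets and bitsandbytes. The Colab notebook is the canonical version; the LoRA hyperparameters and training arguments below are assumptions on my part (r=16 on the attention q_proj/v_proj modules happens to reproduce the trainable-parameter count quoted below), not the exact values used.

```python
# Sketch: LoRA fine-tuning of OPT 2.7b, loaded in 8-bit, on the Alpaca dataset.
# Hyperparameters are illustrative; see the Colab notebook for the real configuration.
import transformers
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, prepare_model_for_int8_training

model = AutoModelForCausalLM.from_pretrained(
    "facebook/opt-2.7b", load_in_8bit=True, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-2.7b")

# Prepare the 8-bit model for training
# (newer peft versions call this prepare_model_for_kbit_training).
model = prepare_model_for_int8_training(model)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# The Alpaca dataset mirrored on the Hugging Face hub ships a preformatted "text" column.
data = load_dataset("tatsu-lab/alpaca")
data = data.map(lambda row: tokenizer(row["text"], truncation=True, max_length=512))

trainer = transformers.Trainer(
    model=model,
    train_dataset=data["train"],
    args=transformers.TrainingArguments(
        per_device_train_batch_size=4,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
        logging_steps=20,
        output_dir="eluwa-lora",
    ),
    data_collator=transformers.DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
model.config.use_cache = False  # avoid the caching warning during training
trainer.train()
model.save_pretrained("eluwa-lora")
```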
When loaded thusly, OPT 2.7b gives us 5242880 trainable params out of a total 2656839680 (trainable%: 0.19733520390662038).
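If you want to check that figure yourself, peft can report it on the wrapped model:

```python
# Prints something like:
# trainable params: 5242880 || all params: 2656839680 || trainable%: 0.19733520390662038
model.print_trainable_parameters()
```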
## Why "Eluwa"?
Well, the whole thing was inspired by Alpaca, which is a LoRA based on LLaMA. Others adopted the trend (Cabrita, Vicuna, etc.). Now, in Sri Lanka, we don't have llamas (at least, I've never seen any), but we do have goats. Goats are spectacular animals. In Ragama I once beheld a goat fighting a pack of stray dogs (and winning). Then it came for me. I hit it on the head with my umbrella, whereupon it ate the umbrella and chased me the length and breadth of the entire village.

If you can't beat 'em, join 'em. "Eluwa" means goat. Goats are fearsome, versatile, and double as the essential ingredient in mutton rolls. Everything in the known universe is either a goat, or not a goat. They're not as nice as llamas or alpacas, but they'll do.
## License
Facebook's OPT has its own license; [please read it here](https://github.com/facebookresearch/metaseq/blob/main/projects/OPT/MODEL_LICENSE.md).
Alpaca is licensed for research use only. The dataset is CC BY NC 4.0 (allowing only non-commercial use) and they note that models trained using the dataset should not be used outside of research purposes.

Eluwa, therefore, is only for research and non-commercial use, under CC BY NC 4.0. Go experiment with it, but don't use it commercially.