bofenghuang committed
Commit 41079c1 · 1 Parent(s): a78d58e

up

- README.md +7 -4
- tokenizer.model +3 -0
README.md CHANGED
@@ -19,7 +19,7 @@ inference: false
 
 # Vigogne-LoRA-7b: A French Instruct LLaMA Model
 
-Vigogne-LoRA-7b is a
+Vigogne-LoRA-7b is a LLaMA-7B model fine-tuned on the translated [Stanford Alpaca](https://github.com/tatsu-lab/stanford_alpaca) dataset to follow the 🇫🇷 French instructions.
 
 For more information, please visit the Github repo: https://github.com/bofenghuang/vigogne
 
@@ -33,13 +33,16 @@ This repo only contains the low-rank adapter. In order to access the complete mo
 from peft import PeftModel
 from transformers import LlamaForCausalLM, LlamaTokenizer
 
-
+base_model_name_or_path = "<name/or/path/to/hf/llama/7b/model>"
+lora_model_name_or_path = "bofenghuang/vigogne-lora-7b"
+
+tokenizer = LlamaTokenizer.from_pretrained(base_model_name_or_path)
 model = LlamaForCausalLM.from_pretrained(
-
+    base_model_name_or_path,
     load_in_8bit=True,
     device_map="auto",
 )
-model = PeftModel.from_pretrained(model,
+model = PeftModel.from_pretrained(model, lora_model_name_or_path)
 ```
 
 You can infer this model by using the following Google Colab Notebook.
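For quick reference, below is a minimal inference sketch built on the loading snippet added in this commit. The Alpaca-style French prompt and the generation settings are illustrative assumptions, not part of the commit; the Google Colab notebook referenced in the README remains the authoritative usage example.

```python
# Minimal inference sketch (not part of this commit). The prompt template is an
# assumption modelled on the Stanford Alpaca format, translated to French for
# illustration; refer to the Colab notebook linked in the README for the
# official prompt and generation settings.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model_name_or_path = "<name/or/path/to/hf/llama/7b/model>"
lora_model_name_or_path = "bofenghuang/vigogne-lora-7b"

# Load the base LLaMA model in 8-bit and attach the low-rank adapter,
# exactly as in the README snippet above.
tokenizer = LlamaTokenizer.from_pretrained(base_model_name_or_path)
model = LlamaForCausalLM.from_pretrained(
    base_model_name_or_path,
    load_in_8bit=True,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, lora_model_name_or_path)
model.eval()

# Assumed Alpaca-style prompt (French for: "Below is an instruction that
# describes a task. Write a response that appropriately completes the request.")
prompt = (
    "Ci-dessous se trouve une instruction qui décrit une tâche. "
    "Écrivez une réponse qui complète la demande de manière appropriée.\n\n"
    "### Instruction:\nExpliquez ce qu'est un modèle LoRA en une phrase.\n\n"
    "### Réponse:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.inference_mode():
    output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```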
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+size 499723