Sandiago21 committed
Commit f45f66c
1 Parent(s): 108c3ae

Update README.md

Files changed (1)
  1. README.md +6 -6
README.md CHANGED
@@ -96,7 +96,7 @@ def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
 
 Use the code below to get started with the model.
 
-1. You can git clone the repo, which also contains the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the model:
+1. You can directly call the model from HuggingFace using the following code snippet:
 
 ```python
 import torch
@@ -104,11 +104,12 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-7b-hf-prompt-answering"
+BASE_MODEL = "decapoda-research/llama-7b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
 model = LlamaForCausalLM.from_pretrained(
-    config.base_model_name_or_path,
+    BASE_MODEL,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
@@ -132,6 +133,7 @@ if torch.__version__ >= "2":
 ```
 
 ### Example of Usage
+
 ```python
 instruction = "What is the capital city of Greece and with which countries does Greece border?"
 input_ctxt = None # For some tasks, you can provide an input context to help the model generate a better response.
@@ -154,7 +156,7 @@ print(response)
 >>> The capital city of Greece is Athens and it borders Turkey, Bulgaria, Macedonia, Albania, and the Aegean Sea.
 ```
 
-2. You can also directly call the model from HuggingFace using the following code snippet:
+2. You can git clone the repo, which also contains the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the model:
 
 ```python
 import torch
@@ -162,12 +164,11 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-7b-hf-prompt-answering"
-BASE_MODEL = "decapoda-research/llama-7b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
 model = LlamaForCausalLM.from_pretrained(
-    BASE_MODEL,
+    config.base_model_name_or_path,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
@@ -191,7 +192,6 @@ if torch.__version__ >= "2":
 ```
 
 ### Example of Usage
-
 ```python
 instruction = "What is the capital city of Greece and with which countries does Greece border?"
 input_ctxt = None # For some tasks, you can provide an input context to help the model generate a better response.
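The hunk headers above reference a `generate_prompt(instruction, input_ctxt)` helper that builds the prompt fed to the model. Its body is not shown in this diff; the following is a minimal Alpaca-style sketch of what such a helper typically looks like for LLaMA instruction-tuned adapters (the exact template wording in the README may differ):

```python
def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
    """Build an Alpaca-style prompt. The template text below is an
    assumption; the README's actual wording may differ."""
    if input_ctxt:
        # Variant used when an input context accompanies the instruction.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_ctxt}\n\n"
            "### Response:\n"
        )
    # Variant used when input_ctxt is None, as in the usage example above.
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```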
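The usage example prints a clean answer ("The capital city of Greece is Athens ..."), even though causal LMs echo the prompt back in their output. A common post-processing step, sketched here as a hypothetical `extract_response` helper (the README's own decoding code is not shown in this diff), is to keep only the text after the final `### Response:` marker:

```python
def extract_response(generated_text: str) -> str:
    """Return only the model's answer: the text after the last
    '### Response:' marker. Hypothetical helper; the README's actual
    post-processing may differ."""
    marker = "### Response:"
    if marker in generated_text:
        # split()[-1] handles the prompt being echoed before the answer.
        return generated_text.split(marker)[-1].strip()
    return generated_text.strip()
```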