Sandiago21 committed
Commit af3e171
1 Parent(s): f45f66c

Update README.md

Files changed (1): README.md +10 -6
README.md CHANGED
@@ -96,7 +96,7 @@ def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
 
 Use the code below to get started with the model.
 
-1. You can directly call the model from HuggingFace using the following code snippet:
+1. You can git clone the repo, which also contains the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the model:
 
 ```python
 import torch
@@ -104,12 +104,14 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-7b-hf-prompt-answering"
-BASE_MODEL = "decapoda-research/llama-7b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
+# Setting the path to look at your repo directory, assuming that you are at that directory when running this script
+config.base_model_name_or_path = "decapoda-research/llama-7b-hf/"
+
 model = LlamaForCausalLM.from_pretrained(
-    BASE_MODEL,
+    config.base_model_name_or_path,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
@@ -133,7 +135,6 @@ if torch.__version__ >= "2":
 ```
 
 ### Example of Usage
-
 ```python
 instruction = "What is the capital city of Greece and with which countries does Greece border?"
 input_ctxt = None  # For some tasks, you can provide an input context to help the model generate a better response.
@@ -156,7 +157,7 @@ print(response)
 >>> The capital city of Greece is Athens and it borders Turkey, Bulgaria, Macedonia, Albania, and the Aegean Sea.
 ```
 
-2. You can git clone the repo, which also contains the artifacts for the base model for simplicity and completeness, and run the following code snippet to load the model:
+2. You can directly call the model from HuggingFace using the following code snippet:
 
 ```python
 import torch
@@ -164,11 +165,12 @@ from peft import PeftConfig, PeftModel
 from transformers import GenerationConfig, LlamaTokenizer, LlamaForCausalLM
 
 MODEL_NAME = "Sandiago21/llama-7b-hf-prompt-answering"
+BASE_MODEL = "decapoda-research/llama-7b-hf"
 
 config = PeftConfig.from_pretrained(MODEL_NAME)
 
 model = LlamaForCausalLM.from_pretrained(
-    config.base_model_name_or_path,
+    BASE_MODEL,
     load_in_8bit=True,
     torch_dtype=torch.float16,
     device_map="auto",
@@ -192,6 +194,7 @@ if torch.__version__ >= "2":
 ```
 
 ### Example of Usage
+
 ```python
 instruction = "What is the capital city of Greece and with which countries does Greece border?"
 input_ctxt = None  # For some tasks, you can provide an input context to help the model generate a better response.
@@ -214,6 +217,7 @@ print(response)
 >>> The capital city of Greece is Athens and it borders Turkey, Bulgaria, Macedonia, Albania, and the Aegean Sea.
 ```
 
+
 ## Training Details
 
 ## Training procedure
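
Note: the hunk headers reference a `generate_prompt(instruction: str, input_ctxt: str = None) -> str` helper whose body lies outside this diff. For readers of this commit, a minimal sketch of what an Alpaca-style prompt builder with that signature typically looks like is shown below; this is a hypothetical illustration, not the repo's actual implementation.

```python
# Hypothetical sketch of the generate_prompt helper named in the hunk headers.
# The real body is defined elsewhere in README.md; this follows the common
# Alpaca-style template for instruction/input/response prompts.
def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
    if input_ctxt:
        # Variant with an additional input context.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_ctxt}\n\n"
            "### Response:\n"
        )
    # Variant without input context (input_ctxt = None, as in the usage example).
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = generate_prompt(
    "What is the capital city of Greece and with which countries does Greece border?"
)
```

The resulting `prompt` string is what the usage snippets tokenize and pass to `model.generate(...)`.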