---
license: other
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- llama
- decapoda-research-7b-hf
- prompt answering
- peft
---
## Model Card for Model ID
This repository contains a LLaMA-7B model further fine-tuned on conversations and question-answering prompts.
This model is a fine-tuned version of [chainyo/alpaca-lora-7b](https://huggingface.co/chainyo/alpaca-lora-7b) trained on a conversations dataset.
⚠️ **[LLaMA-7b-hf](https://huggingface.co/decapoda-research/llama-7b-hf) was used as the base model, so this model is for research purposes only (see the [license](https://huggingface.co/decapoda-research/llama-7b-hf/blob/main/LICENSE)).**
## Model Details
### Model Description
The decapoda-research/llama-7b-hf model was fine-tuned on conversations and question-answering prompts.
**Developed by:** [More Information Needed]
**Shared by:** [More Information Needed]
**Model type:** Causal LM
**Language(s) (NLP):** English, multilingual
**License:** Research
**Finetuned from model:** decapoda-research/llama-7b-hf
## Model Sources
**Repository:** [More Information Needed]
**Paper:** [More Information Needed]
**Demo:** [More Information Needed]
## Uses
The model can be used for prompt answering.
### Direct Use
The model can be used for prompt answering.
### Downstream Use
The model can be used for text generation and prompt answering.
## Recommendations
Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
# Usage
## Creating prompt
The model was trained on the following kind of prompt:
```python
def generate_prompt(instruction: str, input_ctxt: str = None) -> str:
    if input_ctxt:
        return f"""Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Input:
{input_ctxt}

### Response:"""
    else:
        return f"""Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:"""
```
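For instance (a quick illustration; the instruction and input below are arbitrary examples, not taken from the training data):

```python
# Build a prompt that includes additional input context.
prompt = generate_prompt(
    instruction="Which is the capital city of Greece and with which countries does Greece border?",
    input_ctxt="Question answering",
)
print(prompt)
```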
## How to Get Started with the Model
Use the code below to get started with the model.
```python
import torch
from transformers import LlamaTokenizer, LlamaForCausalLM
from peft import PeftModel

MODEL_NAME = "decapoda-research/llama-7b-hf"

# Load the tokenizer; pad token id 0 so padded positions are masked out.
tokenizer = LlamaTokenizer.from_pretrained(MODEL_NAME, add_eos_token=True)
tokenizer.pad_token_id = 0

# Load the base model in 8-bit precision (requires bitsandbytes and accelerate),
# then attach the fine-tuned PEFT adapters on top.
model = LlamaForCausalLM.from_pretrained(MODEL_NAME, load_in_8bit=True, device_map="auto")
model = PeftModel.from_pretrained(model, "Sandiago21/llama-7b-hf")
```
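As a quick sanity check after loading (a minimal sketch, assuming the `model` object from above), you can switch to inference mode and confirm that only the small adapter is trainable:

```python
# Disable dropout and other training-time behavior.
model.eval()

# PEFT models report trainable vs. total parameters; after loading adapters
# for inference, the trainable fraction should be very small.
model.print_trainable_parameters()
```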
### Example of Usage
```python
from transformers import GenerationConfig

PROMPT = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.\n\n### Instruction:\nWhich is the capital city of Greece and with which countries does Greece border?\n\n### Input:\nQuestion answering\n\n### Response:\n"""
DEVICE = "cuda"

# Tokenize the prompt and move the input ids to the GPU.
inputs = tokenizer(
    PROMPT,
    return_tensors="pt",
)
input_ids = inputs["input_ids"].to(DEVICE)

# Low temperature with nucleus sampling and a repetition penalty
# keeps the answer focused and discourages loops.
generation_config = GenerationConfig(
    temperature=0.1,
    top_p=0.95,
    repetition_penalty=1.2,
)

print("Generating Response ... ")
with torch.no_grad():
    generation_output = model.generate(
        input_ids=input_ids,
        generation_config=generation_config,
        return_dict_in_generate=True,
        output_scores=True,
        max_new_tokens=256,
    )

# Decode and print the generated sequence(s).
for s in generation_output.sequences:
    print(tokenizer.decode(s))
```
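The model echoes the full prompt before its answer (see the example output below), so for downstream use you may want to keep only the text after the last `### Response:` marker. A minimal sketch, assuming the `generation_output` and `tokenizer` objects from the example above:

```python
# Decode the first sequence, dropping special tokens such as <unk>.
decoded = tokenizer.decode(generation_output.sequences[0], skip_special_tokens=True)

# Keep only the text produced after the final "### Response:" marker.
response = decoded.split("### Response:")[-1].strip()
print(response)
```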
### Example Output
```
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
### Instruction:
Which is the capital city of Greece and with which countries does Greece border?
### Input:
Question answering
### Response:
Generating...
<unk> Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
### Instruction:
Which is the capital city of Greece and with which countries does Greece border?
### Input:
Question answering
### Response:
<unk>capital city of Athens and it borders Albania to the northwest, North Macedonia and Bulgaria to the northeast, Turkey to the east, and Libya to the southeast across the Mediterranean Sea.
```
## Training Details
### Training hyperparameters
The following hyperparameters were used during training (a sketch of an equivalent `TrainingArguments` setup is shown after the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 2
- mixed_precision_training: Native AMP
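The hyperparameters above map roughly onto the following `transformers.TrainingArguments` configuration (a hedged sketch; the output directory is a placeholder and dataset-specific options are omitted, since the original training script is not published):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./llama-7b-hf-finetuned",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 4 x 2 = total train batch size of 8
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=2,
    fp16=True,  # native AMP mixed-precision training
)
```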
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.12.1
### Training Data
The decapoda-research/llama-7b-hf model was fine-tuned on conversations and question-answering data.
### Training Procedure
The decapoda-research/llama-7b-hf model was further trained and fine-tuned on question-answering and prompt data for 1 epoch (approximately 10 hours of training on a single GPU).
## Model Architecture and Objective
The model is based on decapoda-research/llama-7b-hf, with adapters fine-tuned on top of the base model on conversations and question-answering data.
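Because the fine-tuned weights live in separate adapters, they can optionally be folded back into the base model for adapter-free deployment. A minimal sketch using PEFT's `merge_and_unload` (assuming the base model is loaded in half precision, since merging into 8-bit quantized weights is not supported in this setup; the output path is a placeholder):

```python
import torch
from transformers import LlamaForCausalLM
from peft import PeftModel

# Load the base model in fp16 (merging requires non-quantized weights).
base = LlamaForCausalLM.from_pretrained(
    "decapoda-research/llama-7b-hf", torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base, "Sandiago21/llama-7b-hf")

# Fold the adapter weights into the base model and drop the PEFT wrapper.
merged = model.merge_and_unload()
merged.save_pretrained("./llama-7b-hf-merged")  # placeholder output path
```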