---
language:
- fa
tags:
- text-generation
- persian
- poetry
- peft
- quantization
- llama
base_model: meta-llama/Llama-3.1-8B-Instruct
finetuned_from: meta-llama/Llama-3.1-8B-Instruct
library_name: transformers
pipeline_tag: text-generation
trust_remote_code: true
special_tokens:
  additional_special_tokens:
    - '[شروع_شعر]'
    - '[پایان_شعر]'
    - '[مصرع]'
quantization:
  load_in_4bit: true
  bnb_4bit_quant_type: nf4
  bnb_4bit_compute_dtype: float16
  bnb_4bit_use_double_quant: true
license: mit
---
# Model Card for llama_poetry_fa
**Model Name:** DivAIn (دیوان) – A Persian Poetry-Driven Llama-Based Language Model
**Model URL:** [https://huggingface.co/8lianno/llama_poetry_fa](https://huggingface.co/8lianno/llama_poetry_fa)
## Model Summary
`llama_poetry_fa` is a Persian poetry generation model fine-tuned from a Llama 3.1-based checkpoint. It aims to produce stylistically coherent, culturally relevant, and metrically sound verses in response to a user’s prompt. The model focuses on generating output that aligns with classical Persian poetic traditions, such as maintaining consistent rhyme schemes and thematic unity.
## Model Details
### Model Description
- **Developed by:** 8lianno
- **Model type:** Causal Language Model (Llama-based, instruction-tuned)
- **Language:** Persian (fa)
- **License:** MIT (Dedicated Model For Matrix AI)
- **Finetuned from:** `meta-llama/Llama-3.1-8B-Instruct`
This model has been trained to understand Persian poetic structure, using additional tokens `[شروع_شعر]`, `[پایان_شعر]`, and `[مصرع]` to guide and maintain poem formatting. It can respond to various thematic prompts—whether about nature, love, historical figures, or abstract concepts—and produce a poem that attempts to follow Persian poetry norms.
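As a rough illustration, the sketch below shows one way these markers can delimit a poem and separate its verses. The exact template used during fine-tuning is not documented here, so the layout is an assumption rather than the actual training format.

```python
# Hypothetical example of how the special markers can delimit a poem.
# The exact fine-tuning template is not published, so this layout is an assumption.
POEM_START = "[شروع_شعر]"
POEM_END = "[پایان_شعر]"
VERSE_SEP = "[مصرع]"

def format_poem(verses):
    """Wrap a list of verses with the poem markers used by the model."""
    body = f" {VERSE_SEP} ".join(verses)
    return f"{POEM_START} {body} {POEM_END}"

print(format_poem(["دریا آینه‌ی آسمان است", "در سینه‌ی او، راز نهان است"]))
```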
### Model Sources
- **Repository:** [https://huggingface.co/8lianno/llama_poetry_fa](https://huggingface.co/8lianno/llama_poetry_fa)
- **Paper:** No paper at this time.
## Uses
### Direct Use
This model is intended to generate Persian poetry in response to user prompts. It can be used as a creative writing aid, a tool for literary exploration, or as a source of inspiration for authors, poets, and students studying Persian literature.
**Example:**
- **Prompt:** "درباره بهار شعری بسرایید"
- **Response:**
- عشق چون بهار به جان می افروزد
- هر دم به گلستان دل می روزند
- در شب سیاه، چون شمع می تابد
- هر دم به عشق، جان می روزند
- **Prompt:** "درباره دریا شعری بسرایید"
- **Response:**
- دریا آینه‌ی آسمان است
- در سینه‌ی او، راز نهان است
- هر موج او به ساحل می‌رسد
- صد قصه‌ی عشق و غم و جان است
- **Prompt:** "درباره عشق بنویسید"
- **Response:**
- امید چراغ راه بود
- در ظلمت شب تاب بود
- هر کس که با امید همره بود
- صاحب گنج بی‌بند و بست بود
- **Prompt:** "آسمان چرا می گرید؟"
- **Response:**
- آسمان ز عشق می‌گرید
- از غم خورشید می‌گرید
- هر کس که به دل عاشق شد
- صد بهار در دل می‌گرید
### Downstream Use
While primarily designed for poetry generation, the model can be integrated into:
- Persian literature chatbots (a minimal integration sketch follows this list)
- Educational applications for Persian poetry studies
- Cultural or literary heritage projects
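For the chatbot case, the sketch below wraps the `PoetryGenerator` class from the getting-started code further down in a simple console loop. The loop, its exit command, and the placeholder token are illustrative assumptions, not a supported interface.

```python
# Minimal console chatbot sketch around the PoetryGenerator class defined in
# the "How to Get Started" section below. The loop and exit command are
# illustrative assumptions, not part of the released model.
def poetry_chat(generator):
    print("Persian poetry bot (type 'exit' to quit)")
    while True:
        prompt = input("You: ").strip()
        if not prompt or prompt.lower() == "exit":
            break
        try:
            print("Bot:\n" + generator.generate_poem(prompt))
        except Exception as err:
            print(f"Generation failed: {err}")

# Usage (assuming a valid Hugging Face access token):
# generator = PoetryGenerator(model_path="8lianno/llama_poetry_fa", token="<HF_TOKEN>")
# poetry_chat(generator)
```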
### Out-of-Scope Use
The model is not designed for:
- Factual Question-Answering (it may produce poetic but factually incorrect answers)
- Non-Persian text generation
- High-stakes decision making
## Bias, Risks, and Limitations
The model’s content is drawn from its training data and may reflect certain cultural or poetic biases. It might:
- Present thematically repetitive motifs or styles from classical Persian poetry.
- Struggle with modern or colloquial Persian forms.
- Occasionally produce incoherent or repetitive verses.
Users should critically evaluate the generated text, especially if used in public-facing contexts.
### Recommendations
- Review generated poems for cultural appropriateness.
- Verify factual accuracy independently.
- Treat the model as a source of indirect, poetic responses rather than direct answers.
- For sensitive or controversial topics, consider human moderation.
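Because the model can occasionally repeat verses, a lightweight automated check can flag obviously repetitive output before human review. The heuristic below (splitting on line breaks and counting duplicate verses) is only a sketch; the threshold is an arbitrary assumption.

```python
# Illustrative heuristic: flag poems whose verses repeat too often.
# The line-based splitting and the threshold are assumptions for this sketch.
from collections import Counter

def looks_repetitive(poem: str, max_repeats: int = 2) -> bool:
    verses = [v.strip() for v in poem.splitlines() if v.strip()]
    counts = Counter(verses)
    return any(n > max_repeats for n in counts.values())

sample = "آسمان ز عشق می‌گرید\nآسمان ز عشق می‌گرید\nآسمان ز عشق می‌گرید"
print(looks_repetitive(sample))  # True: the same verse appears three times
```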
## How to Get Started with the Model
### Poetry Generator Code
```bash
pip install -U "transformers>=4.43.0"
pip install -U accelerate
pip install -U peft
pip install bitsandbytes==0.42.0
```
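Note that 4-bit loading through bitsandbytes requires a CUDA-capable GPU. A quick check such as the one below can confirm that PyTorch sees a GPU before running the generator; it is only a convenience check, not part of the model card's requirements.

```python
# Quick sanity check: 4-bit bitsandbytes loading needs a CUDA GPU.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```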
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel


class PoetryGenerator:
    def __init__(self, model_path, token):
        self.token = token
        self.device = "cuda" if torch.cuda.is_available() else "cpu"

        # Configure quantization settings
        bnb_config = BitsAndBytesConfig(
            load_in_4bit=True,
            bnb_4bit_quant_type="nf4",
            bnb_4bit_compute_dtype=torch.float16,
            bnb_4bit_use_double_quant=True
        )

        # Load tokenizer from the base model used during fine-tuning
        self.tokenizer = AutoTokenizer.from_pretrained(
            "meta-llama/Llama-3.1-8B-Instruct",
            token=token,
            trust_remote_code=True
        )
        self.tokenizer.pad_token = self.tokenizer.eos_token

        # Add the special tokens that were used during training
        special_tokens = {
            "additional_special_tokens": [
                "[شروع_شعر]",
                "[پایان_شعر]",
                "[مصرع]"
            ]
        }
        self.tokenizer.add_special_tokens(special_tokens)

        # Load the base model
        base_model = AutoModelForCausalLM.from_pretrained(
            "meta-llama/Llama-3.1-8B-Instruct",
            token=token,
            device_map="auto",
            trust_remote_code=True,
            torch_dtype=torch.float16,
            quantization_config=bnb_config
        )

        # Resize token embeddings to match the tokenizer with added special tokens
        base_model.resize_token_embeddings(len(self.tokenizer))

        # Load the fine-tuned PEFT adapter from the Hugging Face Hub
        self.model = PeftModel.from_pretrained(
            base_model,
            model_path,
            token=token,
            device_map="auto"
        )
        self.model.eval()

    def generate_poem(self, prompt):
        formatted_prompt = f"""سوال: {prompt}
لطفا یک شعر فارسی در پاسخ به این سوال بسرایید که دارای وزن و قافیه مناسب باشد.
شعر:"""
        inputs = self.tokenizer(formatted_prompt, return_tensors="pt", padding=True)
        inputs = {k: v.to(self.device) for k, v in inputs.items()}

        with torch.no_grad():
            outputs = self.model.generate(
                **inputs,
                max_length=512,
                num_return_sequences=1,
                temperature=0.7,
                top_p=0.9,
                do_sample=True,
                pad_token_id=self.tokenizer.pad_token_id,
                eos_token_id=self.tokenizer.eos_token_id
            )
        return self.tokenizer.decode(outputs[0], skip_special_tokens=True)


def main():
    # Use the Hugging Face Hub model path instead of a local path
    generator = PoetryGenerator(
        model_path="8lianno/llama_poetry_fa",
        token=""  # supply your Hugging Face access token here
    )

    prompts = [
        "درباره بهار شعری بسرایید",
        "شعری درباره عشق بنویسید",
        "درباره دریا شعری بسرایید"
    ]

    print("=== Persian Poetry Generation ===\n")
    for i, prompt in enumerate(prompts, 1):
        print(f"\nPrompt {i}: {prompt}")
        print("\nGenerated Poetry:")
        try:
            poem = generator.generate_poem(prompt)
            print(poem)
            print("\n" + "="*50)
        except Exception as e:
            print(f"Error generating poem: {str(e)}")
            print(f"Error type: {type(e)}")


if __name__ == "__main__":
    main()
```
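Since `generate_poem` decodes with `skip_special_tokens=True`, the structural markers are stripped from the returned text. If you instead decode with `skip_special_tokens=False`, you can split the output into individual verses roughly as sketched below. This assumes the fine-tuned model actually emits the markers in its output, which may not always hold.

```python
# Sketch: recover individual verses from a generation decoded with the special
# tokens kept. Assumes the model emits [شروع_شعر] ... [مصرع] ... [پایان_شعر];
# if it does not, the whole text is returned as a single verse.
def split_verses(text: str) -> list[str]:
    start, end, sep = "[شروع_شعر]", "[پایان_شعر]", "[مصرع]"
    if start in text:
        text = text.split(start, 1)[1]
    if end in text:
        text = text.split(end, 1)[0]
    return [v.strip() for v in text.split(sep) if v.strip()]

# Example with a decoded string that kept the special tokens:
raw = "[شروع_شعر] دریا آینه‌ی آسمان است [مصرع] در سینه‌ی او، راز نهان است [پایان_شعر]"
print(split_verses(raw))
```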