Model Information
This model is a fine-tuned version of meta-llama/Llama-3.2-3B-Instruct, trained on the audit-regulation, ice-find, and fin-gpt datasets.
How to use
import torch
from transformers import pipeline

model_id = "daishen/llama3.2-3b-ins-regulation"

pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

outputs = pipe(
    messages,
    max_new_tokens=256,
)
print(outputs[0]["generated_text"][-1])
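With chat-style inputs the pipeline returns the full conversation, so outputs[0]["generated_text"][-1] is the assistant's reply as a dict with "role" and "content" keys.

If you prefer to load the model and tokenizer explicitly rather than through the pipeline helper, a minimal sketch is shown below; the generation settings mirror the example above and are illustrative, not tuned for this model.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "daishen/llama3.2-3b-ins-regulation"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

# Format the conversation with the model's chat template and move it to the model device.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# max_new_tokens mirrors the pipeline example; adjust as needed.
output_ids = model.generate(input_ids, max_new_tokens=256)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))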