Model Overview

Labess-7b-chat is an open, instruction-tuned model for Tunisian Derja. It is a continual pre-training of jais-adapted-7b-chat on the Tunisian_Derja_Dataset.

Uploaded model

  • Developed by: Linagora
  • License: apache-2.0
  • Finetuned from model: inceptionai/jais-adapted-7b-chat

Usage

Below are a few code snippets to help you get started with the model quickly. First, install the Transformers library with:

pip install transformers
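
The pipeline example below also uses torch for bfloat16 inference; if torch is not already in your environment, you can install it (and, optionally, accelerate if you want automatic device placement) with:

pip install torch accelerate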

Running with the pipeline API

import torch
from transformers import pipeline

# Load the model as a chat-style text-generation pipeline in bfloat16
pipe = pipeline(
    "text-generation",
    model="linagora/Labess-7b-chat-16bit",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device="cuda",  # replace with "mps" to run on a Mac device
)

messages = [
    {"role": "user", "content": 'وين تجي تونس؟'},  # "Where is Tunisia located?"
]

# Generate a short answer and extract the assistant's reply from the returned chat history
outputs = pipe(messages, max_new_tokens=64, do_sample=True, temperature=0.2)
assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
print(assistant_response)
Response: تونس هي بلاد في شمال إفريقيا هي بلاد جميلة برشة ومعروفة في العالم الكل هي بلاد فيها مناظر طبيعية
(English gloss: "Tunisia is a country in North Africa; it is a very beautiful country, known all over the world, with natural landscapes.")
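
As an alternative to the pipeline API, the model can also be called directly through AutoModelForCausalLM and AutoTokenizer. The snippet below is a minimal sketch, assuming the linagora/Labess-7b-chat-16bit checkpoint ships a chat template and that accelerate is installed for device_map="auto":

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "linagora/Labess-7b-chat-16bit"

# Load the tokenizer and the model in bfloat16 on the available device(s)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": 'وين تجي تونس؟'},  # "Where is Tunisia located?"
]

# Build the prompt with the model's chat template, then generate a reply
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=64, do_sample=True, temperature=0.2)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))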

Citations

If you use the Labess-7b-chat model, please cite:

@misc{linagora2025LLM-tn,
  author = {Wajdi Ghezaiel and Jean-Pierre Lorré},
  title = {Labess-7b-chat: Tunisian Derja LLM},
  year = {2025},
  month = {January},
  url = {https://huggingface.co/datasets/Wajdi1976/Labess-7b-chat}
}
