Gemma-2-2B-Bulgarian

  • Developed by: petkopetkov
  • License: apache-2.0
  • Finetuned from model: unsloth/gemma-2-2b-bnb-4bit

Gemma-2-2B finetuned on datasets translated into Bulgarian:

  • MMLU: multiple-choice questions from many branches of knowledge
  • Winogrande: commonsense reasoning tested with fill-in-the-blank pronoun resolution
  • HellaSwag: commonsense sentence completion
  • ARC Easy/Challenge: grade-school science questions requiring reasoning
  • GSM8K: grade-school math word problems requiring multi-step arithmetic
  • MathQA: multiple-choice math word problems
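For finetuning a causal language model, multiple-choice items like the ones above are typically serialized into a single text prompt. A minimal sketch of such a serialization (the Cyrillic letter labels and the "Отговор:" cue are illustrative assumptions, not the card's documented training format):

```python
def format_mc_prompt(question: str, choices: list[str]) -> str:
    # Serialize a multiple-choice question into one prompt string.
    # NOTE: labels and wording are assumptions for illustration only.
    letters = "АБВГ"  # Bulgarian option labels (assumed)
    lines = [question]
    lines += [f"{letters[i]}) {choice}" for i, choice in enumerate(choices)]
    lines.append("Отговор:")  # "Answer:"
    return "\n".join(lines)

prompt = format_mc_prompt("Колко е 2 + 2?", ["3", "4", "5", "22"])
print(prompt)
```

The model would then be trained (or prompted) to continue the text after "Отговор:" with the correct option label.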

Usage

First, install the Transformers library (and Accelerate, which device_map="auto" requires):

pip install -U transformers accelerate

Run with the pipeline API

import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="petkopetkov/gemma-2-2b-bg",
    torch_dtype=torch.bfloat16,  # weights are stored in bfloat16
    device_map="auto"  # places the model on available GPU(s) via Accelerate
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"

print(pipe(prompt)[0]["generated_text"])
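Note that the text-generation pipeline returns the prompt concatenated with the model's continuation in generated_text. You can either pass return_full_text=False in the pipeline call, or strip the prompt yourself; a small sketch of the latter (shown with a mocked output string rather than a real model call):

```python
def strip_prompt(generated_text: str, prompt: str) -> str:
    # Remove the echoed prompt from a text-generation pipeline output.
    if generated_text.startswith(prompt):
        return generated_text[len(prompt):].lstrip()
    return generated_text

# Mocked pipeline output: prompt followed by the continuation.
full = "Колко е 2 + 2? 2 + 2 е 4."
print(strip_prompt(full, "Колко е 2 + 2?"))  # -> "2 + 2 е 4."
```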
Model details

  • Model size: 2.61B params
  • Tensor type: BF16
  • Base model: google/gemma-2-2b