
Saba-Ethiopia

A 4-bit quantized LLaMA-3 3B model fine-tuned for [specific purpose].

Model Details

  • Base Model: LLaMA-3 3B
  • Quantization: 4-bit
  • Use Case: [Describe what the model is fine-tuned for]

Usage

To load the model and run inference with the Transformers library:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("modeltrainer1/Saba-Ethiopia", torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("modeltrainer1/Saba-Ethiopia")

# Tokenize a prompt, generate a completion, and decode it back to text
inputs = tokenizer("Your input text here", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
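
The card lists the model as 4-bit quantized. If the repository already ships bitsandbytes 4-bit weights, their quantization config is picked up automatically and the call above is sufficient. If instead you need to quantize at load time, the snippet below is a minimal sketch of a bitsandbytes NF4 setup; it assumes a CUDA GPU and the bitsandbytes package, and is not a confirmed detail of this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 4-bit NF4 quantization applied at load time (assumption: weights are compatible with bitsandbytes)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "modeltrainer1/Saba-Ethiopia",
    quantization_config=bnb_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("modeltrainer1/Saba-Ethiopia")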