---
base_model: unsloth/Meta-Llama-3.2-1B-Instruct
language:
- es
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- gguf
- q4_k_m
- 4bit
- sharegpt
- pretraining
- finetuning
- Q5_K_M
- Q8_0
- uss
- Perú
- Lambayeque
- Chiclayo
datasets:
- ussipan/sipangpt
pipeline_tag: text2text-generation
new_version: ussipan/SipanGPT-0.3-Llama-3.2-1B-GGUF
---
# SipánGPT 0.2 Llama 3.2 1B GGUF
- Modelo pre-entrenado para responder preguntas de la Universidad Señor de Sipán de Lambayeque, Perú.
- Pre-trained model to answer questions about the Señor de Sipán University of Lambayeque, Peru.
## Testing the model
- Debido a la cantidad de conversaciones con las que fue entrenado (5400 conversaciones), el modelo genera bastantes alucinaciones.
- Because it was trained on a relatively small set of 5,400 conversations, the model produces a fair number of hallucinations; see the inference sketch below.
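
The repository ships GGUF quantizations (the tags above list Q4_K_M, Q5_K_M, and Q8_0), so the model can be run locally with llama.cpp-compatible tooling. The snippet below is a minimal sketch using llama-cpp-python; the repo id and the quantization filename pattern are assumptions inferred from this card's metadata, so check the repository's file list before running it.

```python
# Minimal inference sketch with llama-cpp-python (pip install llama-cpp-python huggingface_hub).
# The repo_id and the GGUF filename pattern are ASSUMPTIONS based on this card's metadata;
# verify them against the files actually published in the repository.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="ussipan/SipanGPT-0.2-Llama-3.2-1B-GGUF",  # assumed repo id for this 0.2 release
    filename="*Q4_K_M.gguf",                           # assumed quant file; Q5_K_M / Q8_0 are also tagged
    n_ctx=2048,
)

response = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "¿Qué carreras ofrece la Universidad Señor de Sipán?"}
    ],
    max_tokens=256,
    temperature=0.7,
)
print(response["choices"][0]["message"]["content"])
```

Given the small training set noted above, it is worth cross-checking any answer against official university sources.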
## Uploaded model
- Developed by: jhangmez
- License: apache-2.0
- Finetuned from model: unsloth/Meta-Llama-3.2-1B-Instruct
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
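
For reference, the snippet below is a hypothetical sketch of the kind of Unsloth + TRL supervised fine-tuning setup that statement describes. The LoRA settings, hyperparameters, and the ShareGPT-style column handling for the ussipan/sipangpt dataset are all assumptions, not the actual training script.

```python
# Hypothetical fine-tuning sketch (Unsloth + TRL). Hyperparameters, LoRA settings, and the
# dataset's column layout are assumptions; only the base model and dataset ids come from this card.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Meta-Llama-3.2-1B-Instruct",  # base model listed above
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters (illustrative defaults, not the values used for SipánGPT).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# The dataset is tagged as ShareGPT-style; here we ASSUME a "conversations" column of
# {"role", "content"} turns and render each conversation to text with the chat template.
dataset = load_dataset("ussipan/sipangpt", split="train")
dataset = dataset.map(
    lambda ex: {"text": tokenizer.apply_chat_template(ex["conversations"], tokenize=False)}
)

trainer = SFTTrainer(  # older TRL signature; newer releases move these args into SFTConfig
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```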
## SipánGPT 0.2 Llama 3.2 1B GGUF
Made with ❤️ by Jhan Gómez P.