Hackping 2025 Model

This is a test model uploaded for Hackping 2025.

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("kaipybara/hackping-2025")
model = AutoModelForCausalLM.from_pretrained("kaipybara/hackping-2025")

# Tokenize a prompt and generate a continuation
input_text = "Hello, my name is"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)  # without a length argument, generate stops after a very short default length
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
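
By default, generate uses greedy decoding. For more varied output, standard generation arguments such as do_sample, temperature, and top_p can be passed through; the values below are illustrative and not tuned for this model. A minimal sketch, continuing from the snippet above:

# Sampling-based generation; parameter values are illustrative, not tuned for this model
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))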
Model details

Format: Safetensors
Model size: 1.85B params
Tensor types: F32, U8
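
The reported size can be sanity-checked after loading; note that the dtypes of the weights as loaded in PyTorch may differ from the on-disk tensor types listed above. A minimal sketch, reusing the model object from the Usage section:

from collections import Counter

# Count parameters and tally the dtypes of the loaded weights
state = model.state_dict()
num_params = sum(t.numel() for t in state.values())
dtype_counts = Counter(str(t.dtype) for t in state.values())

print(f"Parameters: {num_params / 1e9:.2f}B")  # expected to be roughly 1.85B
print(dtype_counts)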
