This model was trained for the Russian joke generation task as part of the "Advanced NLP" course at HSE University.
You can generate a joke by giving the model a prefix as input.
Main features of the model: grouped-query attention (GQA), ALiBi positional encoding, a byte-level BPE tokenizer, and 79.4M parameters.
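As a rough illustration of the ALiBi positional encoding mentioned above (a sketch of the general technique, not this model's exact implementation — the function name and head-slope schedule follow the ALiBi paper's convention):

```python
import torch

def alibi_bias(num_heads: int, seq_len: int) -> torch.Tensor:
    """Build the ALiBi bias tensor added to attention logits."""
    # Per-head slopes form a geometric sequence: 2^(-8/H), 2^(-16/H), ...
    slopes = torch.tensor([2 ** (-8 * (h + 1) / num_heads) for h in range(num_heads)])
    # Signed distance between key position j and query position i (negative for past tokens)
    pos = torch.arange(seq_len)
    distance = pos[None, :] - pos[:, None]  # (seq_len, seq_len)
    # Linear penalty grows with distance; the upper triangle is irrelevant
    # under a causal mask. Shape: (num_heads, seq_len, seq_len)
    return slopes[:, None, None] * distance[None, :, :]
```

Each head penalizes attention to distant tokens at a different rate, which is what lets ALiBi extrapolate to sequence lengths longer than those seen in training.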
The quality of the jokes is questionable, but the model sometimes produces something funny.
For example: Купил мужик шляпу и говорит: — Вы сегодня моя бабушка ("A man bought a hat and says: — You are my grandmother today")
You can generate your own joke by running this code:
```python
import torch

# ByteLevelBPETokenizer and TransformerForCausalLM are the custom classes
# from the course codebase; import them before running this snippet.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = ByteLevelBPETokenizer.from_pretrained("just-ne-just/llm-course-hw1")
model = TransformerForCausalLM.from_pretrained("just-ne-just/llm-course-hw1")
model = model.to(device)
model = model.eval()

text = "Купил мужик шляпу"
input_ids = torch.tensor(tokenizer.encode(text), device=device)

# Sample up to 50 new tokens with top-k sampling
model_output = model.generate(
    input_ids[None, :],
    max_new_tokens=50,
    eos_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_k=10,
)
print(tokenizer.decode(model_output[0].tolist()))
```