Built with Axolotl

The Bucharest series is primarily an experiment; use the Pallas series instead.

An instruct-based fine-tune of migtissera/Tess-10.7B-v1.5b.

This model was trained on a private dataset plus Mihaiii/OpenHermes-2.5-1k-longest-curated, a subset of HuggingFaceH4/OpenHermes-2.5-1k-longest, which is itself a subset of teknium/OpenHermes-2.5.

Prompt Format:

```
SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
```
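
A minimal sketch of applying this prompt format with the transformers library. The system/user text and generation settings below are illustrative assumptions, not part of the model card; `device_map="auto"` additionally requires the accelerate package.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mihaiii/Bucharest-0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # BF16 weights will be loaded as-is on supported hardware
    device_map="auto",    # requires accelerate; places layers on available devices
)

# Build the prompt in the SYSTEM / USER / ASSISTANT format shown above.
prompt = (
    "SYSTEM: You are a helpful assistant.\n"
    "USER: What is the capital of Romania?\n"
    "ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```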

GGUF:

tsunemoto/Bucharest-0.2-GGUF
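
For running the GGUF build, a rough sketch with llama-cpp-python is shown below. The quantization filename pattern is an assumption (pick whichever file the repo actually ships), and downloading via `from_pretrained` requires the huggingface_hub package.

```python
from llama_cpp import Llama

# Load a quantized GGUF file from the repo listed above.
llm = Llama.from_pretrained(
    repo_id="tsunemoto/Bucharest-0.2-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quantization; adjust to the files present in the repo
    n_ctx=4096,
)

# Same SYSTEM / USER / ASSISTANT prompt format as above; the messages are illustrative.
prompt = (
    "SYSTEM: You are a helpful assistant.\n"
    "USER: Name three rivers in Romania.\n"
    "ASSISTANT:"
)

out = llm(prompt, max_tokens=256, stop=["USER:"])
print(out["choices"][0]["text"])
```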
