Maestro-10B


Model Information

suayptalha/Maestro-10B · Base model: arcee-ai/Virtuoso-Lite · Lineage: DeepSeek-V3 · 10B parameters

Base Model

Maestro-10B is a 10-billion-parameter model fine-tuned from Virtuoso-Lite, a next-generation language model developed by arcee-ai. Virtuoso-Lite is based on the Llama-3 architecture and was distilled from DeepSeek-V3 using approximately 1.1 billion tokens/logits. This distillation allows Virtuoso-Lite to achieve robust performance with a smaller parameter count, excelling in reasoning, code generation, and mathematical problem-solving. Maestro-10B inherits these strengths from its base model and further enhances them through fine-tuning on the OpenOrca dataset. The combination of a distilled base model and targeted fine-tuning makes Maestro-10B a powerful and efficient language model.
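For reference, below is a minimal usage sketch that loads the model with Hugging Face transformers. The model id and FP16 dtype come from this card; the chat-template call is an assumption based on the Llama-3 lineage, and the prompt and generation settings are purely illustrative.

```python
# Minimal sketch: loading suayptalha/Maestro-10B with transformers.
# Assumes standard AutoModel support; the chat template is an assumption
# based on the Llama-3 lineage and is not confirmed by this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "suayptalha/Maestro-10B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are shipped in FP16 per the card
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Explain knowledge distillation in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```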

Loss Graph

(Training loss curve image.)
Format: Safetensors · Model size: 10.3B params · Tensor type: FP16
