Composite Qwen2.5-0.5B Model

This is a composite model created by combining layers from different Qwen2.5-0.5B variants.

Usage

from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

# Load the configuration, weights, and tokenizer from the "ant031525-01" repo
config = AutoConfig.from_pretrained("ant031525-01")
model = AutoModelForCausalLM.from_pretrained("ant031525-01")
tokenizer = AutoTokenizer.from_pretrained("ant031525-01")
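
Once loaded, the model can be used like any other Qwen2.5 causal LM. A minimal generation sketch (the prompt and generation settings below are only illustrative):

# Encode a prompt, generate a short continuation, and decode it back to text
inputs = tokenizer("Give me a short introduction to large language models.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))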

Base Models

This model is composed of layers taken from the following models (a rough sketch of this kind of layer swap follows the list):

  • Qwen/Qwen2.5-0.5B
  • Qwen/Qwen2.5-0.5B-Instruct
  • unsloth/Qwen2.5-0.5B
  • cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B
  • artificialguybr/Qwen2.5-0.5B-OpenHermes2.5
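
The exact layer-to-source assignment is not documented here. As a rough sketch, assuming whole decoder layers are swapped between checkpoints, a composite like this could be assembled as follows (the index ranges and output path are purely hypothetical):

import torch
from transformers import AutoModelForCausalLM

# Hypothetical mapping of decoder-layer indices to source checkpoints;
# the actual composition of this model is not specified in the card.
layer_sources = {
    range(0, 8): "Qwen/Qwen2.5-0.5B",
    range(8, 16): "Qwen/Qwen2.5-0.5B-Instruct",
    range(16, 24): "cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B",
}

# Start from one base checkpoint, then copy whole decoder layers in from the others.
composite = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B", torch_dtype=torch.float32)
for layer_range, source_name in layer_sources.items():
    source = AutoModelForCausalLM.from_pretrained(source_name, torch_dtype=torch.float32)
    for i in layer_range:
        composite.model.layers[i].load_state_dict(source.model.layers[i].state_dict())

composite.save_pretrained("composite-qwen2.5-0.5b")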
Model size: 494M parameters (F32, safetensors)