# Llama-3.2-Instruct-3B-TIES

## Overview
The Llama-3.2-Instruct-3B-TIES model is the result of merging three Llama-3.2-3B variants with the TIES merging method, using mergekit. The merge combines a general-purpose base model with two instruction-tuned models to create a more versatile model capable of handling diverse tasks.
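At a high level, TIES merging builds a task vector for each fine-tuned model (its weights minus the base weights), trims each vector to its largest-magnitude entries, elects a majority sign per parameter, and averages only the entries that agree with that sign before adding the result back to the base. The snippet below is a simplified, single-tensor sketch of those steps, not mergekit's actual implementation; the function name `ties_merge` and the convention of applying the per-model weight before trimming are assumptions for illustration.

```python
import torch

def ties_merge(base: torch.Tensor,
               tuned: list[torch.Tensor],
               weights: list[float],
               density: float = 0.5) -> torch.Tensor:
    """Illustrative TIES merge of one parameter tensor (trim, elect sign, disjoint merge)."""
    task_vectors = []
    for t, w in zip(tuned, weights):
        tv = (t - base) * w                                   # weighted task vector
        k = max(1, int(density * tv.numel()))                 # keep top-`density` fraction by magnitude
        threshold = tv.abs().flatten().kthvalue(tv.numel() - k + 1).values
        tv = torch.where(tv.abs() >= threshold, tv, torch.zeros_like(tv))
        task_vectors.append(tv)

    stacked = torch.stack(task_vectors)                       # (n_models, ...)
    elected_sign = torch.sign(stacked.sum(dim=0))             # majority sign per parameter
    agree = torch.sign(stacked) == elected_sign               # entries matching the elected sign
    kept = torch.where(agree, stacked, torch.zeros_like(stacked))

    counts = agree.sum(dim=0).clamp(min=1)                    # normalize by number of contributors
    return base + kept.sum(dim=0) / counts

# Example: merge two instruct variants into the base tensor with the weights used below.
# merged = ties_merge(base_w, [instruct_w, unsloth_w], weights=[0.5, 0.3], density=0.5)
```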
## Model Details

### Model Description
- Models Used:
  - meta-llama/Llama-3.2-3B (base model)
  - meta-llama/Llama-3.2-3B-Instruct
  - unsloth/Llama-3.2-3B-Instruct
- Merging Tool: Mergekit
- Merge Method: TIES
- Data Type: float16 (FP16) precision
- License: MIT License
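Once published, the merged checkpoint can be loaded like any other Hugging Face causal language model. A minimal sketch, assuming the repository id vhab10/Llama-3.2-Instruct-3B-TIES and that the instruct models' chat template was carried over during the merge:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vhab10/Llama-3.2-Instruct-3B-TIES"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the merge was produced in FP16
    device_map="auto",
)

# Simple chat-style generation, assuming a chat template is present in the tokenizer.
messages = [{"role": "user", "content": "Summarize the TIES merging method in two sentences."}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```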
## Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: meta-llama/Llama-3.2-3B
    # Base model
  - model: meta-llama/Llama-3.2-3B-Instruct
    parameters:
      density: 0.5
      weight: 0.5
  - model: unsloth/Llama-3.2-3B-Instruct
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: meta-llama/Llama-3.2-3B
parameters:
  normalize: true
dtype: float16
```
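To reproduce the merge, the configuration above can be saved to a file (the name `ties_config.yml` below is illustrative) and run with mergekit. A minimal sketch using mergekit's Python entry points; exact option names may differ between mergekit versions:

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (filename is illustrative).
with open("ties_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the TIES merge and write the merged checkpoint to a local directory.
run_merge(
    merge_config,
    out_path="./Llama-3.2-Instruct-3B-TIES",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use GPU if available
        copy_tokenizer=True,             # carry the tokenizer over to the output
    ),
)
```

Equivalently, the command-line interface `mergekit-yaml ties_config.yml ./Llama-3.2-Instruct-3B-TIES` performs the same merge.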