Model

  • base_model: yanolja/KoSOLAR-10.7B-v0.2
  • training objective: Instruction Tuning

Dataset

Collected from publicly available data sources.

  • Deduplicated using the algorithm from "Deduplicating Training Data Makes Language Models Better" (a minimal sketch follows this list)
  • Instruction data version 1.4
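
The exact deduplication pipeline is not published in this card. As a rough, purely illustrative sketch of the idea behind the paper above, the hypothetical snippet below drops exact duplicates by hashing normalized documents; the method described in the paper additionally handles near-duplicates and exact substring matches.

import hashlib

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivially different copies hash alike.
    return " ".join(text.lower().split())

def deduplicate(documents):
    # Keep only the first occurrence of each (normalized) document.
    seen, unique_docs = set(), []
    for doc in documents:
        digest = hashlib.sha256(normalize(doc).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique_docs.append(doc)
    return unique_docs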

Code

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_name = "jjingyeom/freeze_KoSoLAR-10.7B-v0.2_1.4_dedup_1"

# Load the checkpoint in half precision (the weights are stored as FP16 safetensors).
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
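
A minimal generation example continuing from the snippet above. The prompt string and generation settings are placeholders, since the instruction template used during training is not documented here.

# Placeholder prompt; replace with the instruction format used during training.
prompt = "Briefly introduce yourself."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))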

Safetensors

  • Model size: 10.8B params
  • Tensor type: FP16
