---
license: apache-2.0
---
# 🐻❄️COKAL-v1_70B🐻❄️

## Model Details
**Model Developers** Seungyoo Lee (DopeorNope)

**Input** Models input text only.

**Output** Models generate text only.
## Model Architecture

COKAL-v1_70B is an auto-regressive 70B-parameter language model based on the LLaMA2 transformer architecture.
**Base Model**
## Training Dataset
- SFT training dataset: garage-bAInd/Open-Platypus
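For reference, datasets in the Open-Platypus lineage are commonly paired with an Alpaca-style instruction template during SFT. The sketch below is an assumption about the prompt format, not something stated in this card; the field names (`instruction`, `input`) mirror the dataset's columns.

```python
# Hedged sketch: Alpaca-style prompt construction often used with
# Open-Platypus-style SFT data. The exact template used to train
# COKAL-v1_70B is an assumption.
def build_prompt(instruction: str, input_text: str = "") -> str:
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

print(build_prompt("Summarize the LLaMA2 architecture in one sentence."))
```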
## Training

I trained the model in an environment with 8× A100 GPUs.
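Some back-of-envelope arithmetic (my illustration, not from the card) shows why a multi-GPU A100 node is needed: the fp16 weights of a 70B-parameter model alone occupy about 140 GB, more than a single 80 GB A100 can hold.

```python
import math

# Illustrative memory estimate for a 70B-parameter model in fp16.
params = 70e9        # 70 billion parameters
bytes_per_param = 2  # fp16/bf16 stores 2 bytes per parameter

weights_gb = params * bytes_per_param / 1e9
print(f"fp16 weights alone: {weights_gb:.0f} GB")   # → fp16 weights alone: 140 GB

a100_gb = 80  # one A100 (80 GB variant)
min_gpus = math.ceil(weights_gb / a100_gb)
print(f"minimum GPUs just to hold the weights: {min_gpus}")  # → 2
```

Training needs far more than the weights (gradients plus optimizer states, several times the weight memory for Adam-style optimizers), which is consistent with using all eight A100s rather than the two that inference alone would require.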
## Implementation Code

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "DopeorNope/COKAL-v1_70B"

# Load the model in fp16 and shard it automatically across available GPUs
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
model_tokenizer = AutoTokenizer.from_pretrained(repo)
```
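Once the model and tokenizer are loaded, a minimal generation helper might look like the following. This is a hedged sketch, not part of the original card; the sampling settings (`do_sample`, `temperature`) are illustrative defaults, and running it requires the full ~140 GB model from the snippet above.

```python
# Hedged sketch: a minimal text-generation helper for the model and
# tokenizer loaded above. Sampling parameters are illustrative assumptions.
def generate(model, tokenizer, prompt: str, max_new_tokens: int = 256) -> str:
    # Tokenize the prompt and move the tensors to the model's device
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # transformers' generate() already runs without gradient tracking
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    # Decode only the newly generated tokens, skipping the echoed prompt
    return tokenizer.decode(
        output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )

# Example (requires the loaded 70B model):
# print(generate(model, model_tokenizer, "Explain attention in one paragraph."))
```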