ko-en-llama2-13b / README.md
---
language:
  - ko
library_name: transformers
pipeline_tag: text-generation
license: apache-2.0
---

Input
Models take text input only.

Output
Models generate text output only.

Model Architecture
ko-en-llama2-13b is an auto-regressive language model based on the LLaMA2 transformer architecture.

Base Model
Llama-2-13B

Training Dataset
Open datasets: wiki and AIHub (English + Korean).
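
Usage
The card declares `library_name: transformers` and `pipeline_tag: text-generation`, so loading follows the standard causal-LM pattern. A minimal sketch, assuming the repository id is `hyunseoki/ko-en-llama2-13b` (inferred from the page header, not stated in the card):

```python
# Minimal text-generation sketch with transformers.
# Assumption: "hyunseoki/ko-en-llama2-13b" is the Hub id of this repository.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hyunseoki/ko-en-llama2-13b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "대한민국의 수도는"  # Korean or English prompts both work per the training data
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 13B model in fp16 needs roughly 26 GB of memory; `device_map="auto"` lets accelerate shard it across available devices.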