Chinese-LLaMA-2-7B-64K

This repository contains Chinese-LLaMA-2-7B-64K, a 64K-context model tuned from Chinese-LLaMA-2-7B using the YaRN method.

For the LoRA-only model, please see: https://huggingface.co/hfl/chinese-llama-2-lora-7b-64k
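
If you prefer the LoRA-only release, the adapter can typically be merged into the base weights with PEFT. The snippet below is a minimal sketch, not the project's official merge script; the base repo id, output path, and dtype are assumptions you may need to adjust (see the project's GitHub repository for the supported procedure).

```python
# Minimal sketch (assumed workflow, not the official merge script):
# apply the LoRA-only adapter to the base Chinese-LLaMA-2-7B weights with PEFT.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "hfl/chinese-llama-2-7b",            # base model repo id (assumption)
    torch_dtype=torch.float16,
)
merged = PeftModel.from_pretrained(base, "hfl/chinese-llama-2-lora-7b-64k")
merged = merged.merge_and_unload()       # bake the adapter into the weights
merged.save_pretrained("chinese-llama-2-7b-64k-merged")  # illustrative output path
```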

Please refer to https://github.com/ymcui/Chinese-LLaMA-Alpaca-2/ for more details.
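
As a minimal usage sketch (not an official snippet from the model card), the merged model can be loaded like any LLaMA-2-based checkpoint with 🤗 Transformers; the prompt, generation parameters, and memory assumptions below are illustrative only.

```python
# Minimal sketch: load the 64K-context model with Hugging Face Transformers.
# Assumptions (not stated in the model card): fp16 weights fit on your GPU
# and no custom code is required; adjust dtype/device as needed.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "hfl/chinese-llama-2-7b-64k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # weights are published in F32/FP16
    device_map="auto",
)

prompt = "Briefly introduce the challenges of long-context modeling."  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```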

Model size: 6.93B parameters (Safetensors; tensor types: F32, FP16)