Trained on over 20k instruction examples, all generated by GPT-4 or humans.

Dataset features:

- 1,000 long, evolved conversations based on LIMA
- A subsection of correct PRM800K data
- A subsection of CamelAI's Physics and Chemistry data

The model was trained with QLoRA using Axolotl.
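As an illustration only (the actual training configuration was not released with this card), a minimal Axolotl QLoRA setup for a Yi-34B base model might look like the following sketch; all values here are assumptions, not the settings used for this model:

```yaml
# Hypothetical Axolotl QLoRA config sketch - not the released training config.
base_model: 01-ai/Yi-34B

load_in_4bit: true        # QLoRA: quantize the base model to 4-bit
adapter: qlora            # train LoRA adapters on top of the quantized model

lora_r: 64
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true  # apply LoRA to all linear layers

datasets:
  - path: ./data/giftedconvo.jsonl   # placeholder path
    type: sharegpt

sequence_len: 4096
micro_batch_size: 1
gradient_accumulation_steps: 4
num_epochs: 3
```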

The prompt format is Vicuna 1.1:

```
User: ...
Assistant: ...
```
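A small helper that assembles a multi-turn prompt in this format might look like the sketch below; the function name and structure are illustrative, not part of the model release:

```python
def build_vicuna_prompt(turns, pending_user_message):
    """Assemble a Vicuna 1.1-style prompt.

    `turns` is a list of (user, assistant) string pairs from earlier in the
    conversation; `pending_user_message` is the new user message, and the
    prompt ends with an open "Assistant:" for the model to complete.
    """
    parts = []
    for user_msg, assistant_msg in turns:
        parts.append(f"User: {user_msg}")
        parts.append(f"Assistant: {assistant_msg}")
    parts.append(f"User: {pending_user_message}")
    parts.append("Assistant:")  # the model continues from here
    return "\n".join(parts)


prompt = build_vicuna_prompt(
    [("Hi", "Hello! How can I help?")],
    "What is QLoRA?",
)
print(prompt)
```

Keeping the trailing `Assistant:` open (with no text after it) is what cues the model to generate the next reply.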
Model size: 34.4B parameters (BF16, Safetensors)