---
license: mit
datasets:
- AmanMussa/kazakh-instruction-v1
language:
- kk
metrics:
- code_eval
pipeline_tag: text-generation
---
|
# Model Card for LLaMA 2 Kazakh

A LLaMA 2 model adapted for the Kazakh language.

|
## Model Details

This model is a parameter-efficient fine-tune (PEFT) of Meta's LLaMA 2 on Kazakh-language instruction data.
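
The card does not publish the fine-tuning configuration, so the snippet below is only an illustrative sketch of a typical LoRA-style PEFT setup for LLaMA 2; the base checkpoint size, target modules, and every hyperparameter shown are assumptions rather than reported values.

```python
# Illustrative LoRA/PEFT sketch only -- the base size and all hyperparameters
# are placeholders, not the configuration actually used for this model.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# Wrap the base model so that only small low-rank adapter matrices are trained
lora_config = LoraConfig(
    r=16,                                 # placeholder rank
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # shows how few parameters are actually trained
```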
|
|
### Model Description

- **Developed by:** Mussa Aman
- **Model type:** Question answering
- **Language(s) (NLP):** Kazakh
- **License:** MIT
- **Finetuned from model:** Meta LLaMA 2
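
A minimal inference sketch (not an official snippet): it assumes the adapter weights in this repository are applied on top of a `meta-llama/Llama-2-7b-hf` base, which the card does not confirm, and `<this-repo-id>` is a placeholder for this model's Hugging Face repository id.

```python
# Minimal inference sketch -- the base checkpoint and repo id are assumptions.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-hf"  # assumed base checkpoint
adapter_id = "<this-repo-id>"         # placeholder for this model's repository id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the Kazakh adapter

prompt = "Қазақстанның астанасы қай қала?"  # "Which city is the capital of Kazakhstan?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```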
|
|
|
### Out-of-Scope Use

The model still makes mistakes during inference, so do not rely on its outputs without human review.
|
## Bias, Risks, and Limitations

The model's parameter count is relatively small, and the training dataset still needs to be cleaned and expanded.
|
### Training Data

The model was fine-tuned on the AmanMussa/kazakh-instruction-v1 dataset of self-instruct Kazakh instruction–response pairs; a short loading example follows the figure below.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/64f75f7bd04a890f5347d436/dICYqSD1SZOhhbNBJ_XWz.png)
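
A small sketch for loading and inspecting that dataset with the `datasets` library; the split name and column layout are assumptions, so check the dataset card for the exact schema.

```python
# Sketch for inspecting the training data; split name and columns are assumptions.
from datasets import load_dataset

ds = load_dataset("AmanMussa/kazakh-instruction-v1", split="train")
print(ds)     # number of rows and column names
print(ds[0])  # one Kazakh instruction-response pair
```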
|
|
## Evaluation

Run summary:

| Metric | Value |
|---|---|
| train/epoch | 1.0 |
| train/global_step | 3263 |
| train/learning_rate | 0.0 |
| train/loss | 0.975 |
| train/total_flos | 5.1749473473500774e+17 |
| train/train_loss | 0.38281 |
| train/train_runtime | 13086.8735 s |
| train/train_samples_per_second | 3.989 |
| train/train_steps_per_second | 0.249 |
|
## Environment

- **Hardware Type:** NVIDIA A100 40GB
- **Hours used:** 10
- **Cloud Provider:** Google Colab
|
## Citation

BibTeX:

    @misc{aman_2023,
      author = {Aman Mussa},
      title = {Self-instruct data pairs for Kazakh language},
      year = {2023},
      howpublished = {\url{https://huggingface.co/datasets/AmanMussa/instructions_kaz_version_1}},
    }

APA:

Aman, M. (2023). Self-instruct data pairs for Kazakh language. Retrieved from https://huggingface.co/datasets/AmanMussa/instructions_kaz_version_1
|
## Model Card Contact

Please contact via email: a[email protected]