---
library_name: transformers
tags:
- trl
- sft
---
### Model Description
This model is a fine-tuned version of [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) on [m-a-p/Code-Feedback](https://huggingface.co/datasets/m-a-p/Code-Feedback), intended to better answer programming-related questions. It was trained by making small modifications to the [sample_finetune.py](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/blob/main/sample_finetune.py) script provided by Microsoft.
- **Developed by:** [Can Deniz Koçak](https://www.linkedin.com/in/candenizkocak/)
- **Finetuned from model:** [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct)
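As a quick orientation, a minimal inference sketch with 🤗 Transformers is shown below. The `model_id` is a placeholder for this repository's id, and the prompt and generation settings are illustrative only.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<this-repository-id>"  # placeholder: replace with this model's Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # required by some transformers versions for Phi-3
)

# Phi-3 is an instruct model, so build the prompt with its chat template.
messages = [
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```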
### Fine-tuning Data
[m-a-p/Code-Feedback](https://huggingface.co/datasets/m-a-p/Code-Feedback)
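The dataset can be pulled directly from the Hub; the sketch below assumes its multi-turn conversations are exposed under a `messages` column of role/content dicts, which can be rendered with the model's chat template for supervised fine-tuning.

```python
from datasets import load_dataset

# Load the Code-Feedback training split from the Hugging Face Hub.
dataset = load_dataset("m-a-p/Code-Feedback", split="train")

# Inspect one conversation (assumed "messages" column: list of {"role", "content"} dicts).
print(dataset[0]["messages"][0])
```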
### Training Procedure
The model was trained on a single A100 GPU on Google Colab.
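The actual configuration is the modified `sample_finetune.py` linked above; the following is only a rough, illustrative sketch of a TRL `SFTTrainer` run over the chat-templated dataset. All hyperparameters here are placeholders, not the values actually used, and argument placement (e.g. `dataset_text_field`, `max_seq_length`) differs between `trl` versions.

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import SFTTrainer

base_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, trust_remote_code=True
)

dataset = load_dataset("m-a-p/Code-Feedback", split="train")

def to_text(example):
    # Render each multi-turn conversation into a single training string
    # using the model's chat template.
    return {"text": tokenizer.apply_chat_template(example["messages"], tokenize=False)}

dataset = dataset.map(to_text, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="phi-3-mini-code-feedback",  # illustrative
    per_device_train_batch_size=4,          # illustrative
    gradient_accumulation_steps=4,          # illustrative
    num_train_epochs=1,                     # illustrative
    learning_rate=2e-5,                     # illustrative
    bf16=True,
    logging_steps=10,
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    dataset_text_field="text",  # in newer trl versions this moves into SFTConfig
    max_seq_length=4096,
    tokenizer=tokenizer,
)
trainer.train()
```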