---
library_name: transformers
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
datasets:
- nbeerbower/bible-dpo
license: apache-2.0
---

# HolyNemo-12B

[mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407) finetuned on [nbeerbower/bible-dpo](https://huggingface.co/datasets/nbeerbower/bible-dpo).

### Method

Finetuned for 1 epoch on a single A100 in Google Colab, following the approach described in [Fine-tune Llama 3 with ORPO](https://mlabonne.github.io/blog/posts/2024-04-19_Fine_tune_Llama_3_with_ORPO.html).
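The linked guide uses ORPO (Odds Ratio Preference Optimization), which adds an odds-ratio preference term over chosen/rejected pairs on top of the standard SFT loss. A minimal sketch of that preference term (the values here are illustrative, not from this training run):

```python
import math

def log_odds(logp: float) -> float:
    """log odds(p) = log(p / (1 - p)), taking a log-probability as input."""
    return logp - math.log1p(-math.exp(logp))

def orpo_odds_ratio_loss(logp_chosen: float, logp_rejected: float) -> float:
    """ORPO preference term: -log sigmoid(log_odds(chosen) - log_odds(rejected))."""
    ratio = log_odds(logp_chosen) - log_odds(logp_rejected)
    return math.log1p(math.exp(-ratio))  # numerically stable -log(sigmoid(ratio))

# When the chosen completion is more probable than the rejected one,
# the preference loss is small; when the order flips, it grows:
low = orpo_odds_ratio_loss(math.log(0.9), math.log(0.1))
high = orpo_odds_ratio_loss(math.log(0.1), math.log(0.9))
```

In the full ORPO objective this term is scaled by a weight λ and added to the SFT cross-entropy over the chosen responses, so the model learns to prefer the chosen completions while still fitting them directly.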