---
language:
- en
license: mit
tags:
- trl
- sft
- sgd
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
datasets:
- nroggendorff/mayo
model-index:
- name: mayo
results: []
---
# Mayonnaise LLM
Mayo is a language model fine-tuned on the [Mayo dataset](https://huggingface.co/datasets/nroggendorff/mayo) with Supervised Fine-Tuning (SFT) using the TRL (Transformer Reinforcement Learning) library. It is based on the [TinyLlama model](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0).
## Features
- Fine-tuned with SFT via the TRL library for improved instruction following
- Supports English language
## Usage
To use the Mayo LLM, you can load the model using the Hugging Face Transformers library:
```python
from transformers import pipeline

# Load the model into a text-generation pipeline.
pipe = pipeline("text-generation", model="nroggendorff/vegetarian-mayo")

# Build a chat-style conversation: a list of role/content message dicts.
question = "What color is the sky?"
conv = [{"role": "user", "content": question}]

# For chat input, 'generated_text' contains the full conversation,
# so the last message is the model's reply.
response = pipe(conv, max_new_tokens=32)[0]["generated_text"][-1]["content"]
print(response)
```
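The indexing at the end of the snippet above can look opaque, so here is a minimal sketch of the data shapes involved. No model is loaded; the pipeline output is mocked purely to illustrate why `[0]["generated_text"][-1]["content"]` yields the reply:

```python
# Input to the pipeline: a chat conversation as a list of message dicts.
conv = [{"role": "user", "content": "What color is the sky?"}]

# Mocked pipeline output (illustrative only): the text-generation pipeline
# returns a list with one dict per input; for chat input, 'generated_text'
# holds the whole conversation including the newly generated assistant turn.
mock_output = [{
    "generated_text": conv + [{"role": "assistant", "content": "Blue."}]
}]

# The last message in the conversation is the assistant's reply.
response = mock_output[0]["generated_text"][-1]["content"]
print(response)  # Blue.
```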
## License
This project is licensed under the MIT License.