---
language:
- en
license: mit
pipeline_tag: text-generation
model-index:
- name: chef-gpt-en
results: []
widget:
- text: 'ingredients>> salmon, lemon; recipe>>'
---
# chef-gpt-en
Try the model [HERE](https://chef-gpt.streamlit.app/).

`chef-gpt-en` is a GPT-2 model fine-tuned for recipe generation on [this Food.com dataset](https://www.kaggle.com/datasets/shuyangli94/food-com-recipes-and-user-interactions/data).
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "auhide/chef-gpt-en"

# Load the tokenizer and model from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
chef_gpt = AutoModelForCausalLM.from_pretrained(MODEL_ID)

ingredients = ", ".join([
    "spaghetti",
    "tomatoes",
    "basel",
    "salt",
    "chicken",
])

# The model expects prompts of the form "ingredients>> ...; recipe>>".
prompt = f"ingredients>> {ingredients}; recipe>>"
tokens = tokenizer(prompt, return_tensors="pt")

# Generate token IDs, then decode them back into text.
output_ids = chef_gpt.generate(**tokens, max_length=124)
recipe = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(recipe)
```
Here is a sample output for this prompt:
```
ingredients>> spaghetti, tomatoes, basel, salt, chicken; recipe>>cook spaghetti according to package directions
meanwhile, place tomato slices and basel in a large pot with salted water and bring to a boil
reduce heat and
```
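As the sample shows, the model echoes the prompt before the recipe and writes one step per line. A minimal post-processing sketch (the `build_prompt` and `extract_recipe` helpers below are hypothetical, not part of this repository):

```python
def build_prompt(ingredients):
    """Format an ingredient list into the model's expected prompt scheme."""
    return f"ingredients>> {', '.join(ingredients)}; recipe>>"


def extract_recipe(generated):
    """Drop the echoed prompt and split the recipe into one step per line."""
    _, _, recipe = generated.partition("recipe>>")
    return [step.strip() for step in recipe.strip().split("\n") if step.strip()]


# Example using the sample output above:
sample = (
    "ingredients>> spaghetti, tomatoes, basel, salt, chicken; "
    "recipe>>cook spaghetti according to package directions\n"
    "meanwhile, place tomato slices and basel in a large pot "
    "with salted water and bring to a boil\n"
    "reduce heat and"
)
steps = extract_recipe(sample)
print(steps[0])  # cook spaghetti according to package directions
```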