---
language:
- en
widget:
- text: >-
    Generate a dialogue between two people about the following topic: A local
    street market bustles with activity, #Person1# tries exotic food for the
    first time, and #Person2#, familiar with the cuisine, offers insights and
    recommendations. Dialogue:
  example_title: Street Market
- text: >-
    Generate a dialogue between two people about the following topic: At a
    quiet park, #Person1# stumbles upon an eerie crime scene, and #Person2#, a
    detective, arrives and begins to unravel the mysterious circumstances of
    the murder. Dialogue:
  example_title: Crime Scene
tags:
- dialogue
- conversation-generator
- flan-t5-base
- fine-tuned
license: apache-2.0
datasets:
- kaggle-dialogue-dataset
---
# Omaratef3221/flan-t5-base-dialogue-generator

## Model Description
This model is a fine-tuned version of Google's [flan-t5-base](https://huggingface.co/google/flan-t5-base),
specifically tailored for generating realistic and engaging dialogues or conversations.
It has been trained to capture the nuances of human conversation, making it well suited for applications that require conversational AI capabilities.

GitHub Repo: https://github.com/omaratef3221/conversation_generator_LLM
## Intended Use
`Omaratef3221/flan-t5-base-dialogue-generator` is ideal for developing chatbots, virtual assistants, and other applications where generating human-like dialogue is crucial.
It can also be used for research and development in natural language understanding and generation.
## How to Use
You can use this model directly with the `transformers` library as follows:

### Download the model
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "Omaratef3221/flan-t5-base-dialogue-generator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
```

### Use with example
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
model_name = "Omaratef3221/flan-t5-base-dialogue-generator"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
prompt = '''
Generate a dialogue between two people about the following topic:
A local street market bustles with activity, #Person1# tries exotic food for the first time, and #Person2#, familiar with the cuisine, offers insights and recommendations.
Dialogue:
'''
# Generate a response to an input statement
input_ids = tokenizer(prompt, return_tensors='pt').input_ids
output = model.generate(input_ids, top_p = 0.6, do_sample=True, temperature = 1.2, max_length = 512)
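# Decode the output and insert line breaks before each speaker tag for readability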
print(tokenizer.decode(output[0], skip_special_tokens=True).replace('#Person2#:', '\n#Person2#:').replace('#Person1#:', '\n#Person1#:'))
```

Output:
```
#Person1#: Oh, that's a nice street market. I'm glad I got to see it.
#Person2#: Yes, it is. I like the food here.
#Person1#: And the prices are reasonable.
#Person2#: I have been to this street market before, and I like it very much.
#Person1#: I'm impressed. I really like it.
#Person2#: I'm familiar with the cuisine. It is one of the best in the world.
#Person1#: What kind of food do you like?
#Person2#: I like Italian food, but I like Thai food.
#Person1#: Oh, that's really exciting. I like it very much. I think I'll take it.
#Person2#: I like Thai food too.
#Person1#: What's your favorite food?
#Person2#: I'm familiar with Thai food. I love spicy food, but I don't like spicy food.
#Person1#: Is that true?
#Person2#: Yes, that's right.
#Person1#: I like spicy food too. How about Thai food?
#Person2#: That's a great idea. I think it's really good.
#Person1#: Oh, that's a good idea. What's your favorite restaurant?
#Person2#: Oh, I'm sure I can recommend it.
#Person1#: I like Thai food, too. What do you recommend?
#Person2#: It's a very popular restaurant in China.
#Person1#: Oh, that's great.
#Person2#: I would like to try it.
#Person1#: Thanks. I'll try it.
```
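
If you prefer a more compact entry point, the same model can also be loaded through the `transformers` `pipeline` helper. This is a minimal sketch rather than part of the original card: the `text2text-generation` task and the generation parameters simply mirror the example above.

```python
from transformers import pipeline

# Load the fine-tuned dialogue generator via the text2text-generation pipeline
generator = pipeline(
    "text2text-generation",
    model="Omaratef3221/flan-t5-base-dialogue-generator",
)

prompt = (
    "Generate a dialogue between two people about the following topic: "
    "A local street market bustles with activity, #Person1# tries exotic food "
    "for the first time, and #Person2#, familiar with the cuisine, offers "
    "insights and recommendations. Dialogue:"
)

# Sample a dialogue; generation settings mirror the example above
result = generator(prompt, do_sample=True, top_p=0.6, temperature=1.2, max_length=512)
print(result[0]["generated_text"])
```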