# About This Model

This model is fine-tuned for AI conversation (chat). It was trained on the **[cogo_chat dataset](abhijitgayen/cogo_chat)**.

# How to Use This Model

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "abhijitgayen/cogo-blenderbot"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

UTTERANCE = "help me to book fcl"
print("Human: ", UTTERANCE)

inputs = tokenizer([UTTERANCE], return_tensors="pt")
reply_ids = model.generate(**inputs)
print("Bot: ", tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
```

# Output

The responses are good, but generation is slow. Since a chatbot is a real-time application, a reply that takes more than 20 seconds to produce is not usable.
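
One way to keep latency in check is to cap the generation length and use greedy decoding, and to measure how long each reply actually takes. The sketch below is illustrative only; the `max_new_tokens` and `num_beams` values are assumptions, not settings recommended for this model.

```python
import time

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "abhijitgayen/cogo-blenderbot"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

UTTERANCE = "help me to book fcl"
inputs = tokenizer([UTTERANCE], return_tensors="pt")

# Cap the reply length and use greedy decoding (num_beams=1); shorter outputs
# and no beam search generally reduce latency. These values are assumptions
# chosen for illustration, not tuned settings.
start = time.time()
reply_ids = model.generate(**inputs, max_new_tokens=60, num_beams=1)
elapsed = time.time() - start

print("Bot: ", tokenizer.batch_decode(reply_ids, skip_special_tokens=True)[0])
print(f"Generation took {elapsed:.1f} seconds")
```

Timing the call this way makes it easy to check whether a given configuration stays under the 20-second budget mentioned above.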