Japanese DialoGPT trained with Aozora
Japanese DialoGPT Small trained on dialogue lines from Aozora Bunko.
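The model can be used like any DialoGPT-style causal LM via `transformers`. A minimal sketch is below; note that the repo id is a placeholder (this card does not state the actual one), and the prompt format assumes the standard DialoGPT convention of terminating each turn with the EOS token.

```python
MODEL_ID = "username/japanese-dialogpt-small-aozora"  # hypothetical repo id


def build_prompt(history, eos_token):
    """DialoGPT-style input: each conversation turn terminated by EOS."""
    return "".join(turn + eos_token for turn in history)


def reply(history, max_length=128):
    """Generate the next utterance for a list of previous turns."""
    # Lazy import so the helper above works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    input_ids = tokenizer.encode(
        build_prompt(history, tokenizer.eos_token), return_tensors="pt"
    )
    output = model.generate(
        input_ids, max_length=max_length, pad_token_id=tokenizer.eos_token_id
    )
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )


# reply(["こんにちは"]) would download the weights and return a response.
print(build_prompt(["こんにちは", "やあ"], "</s>"))
```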
Demo
The demo on this page does not work very well. I recommend trying the Hugging Face Spaces version instead.
References
- Aozora-bunko
  - Japanese public-domain books.
  - I extracted the dialogue portions from these books and used them as the training data.
- japanese-gpt2-small
  - A Japanese GPT-2 model trained on novels. I used the small variant because of the limited GPU memory of my desktop PC (a single RTX 3060) 😢.
  - I used this model as the pre-trained base.
- DialoGPT: Large-Scale Generative Pre-training for Conversational Response Generation
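The exact extraction script is not published, but since Aozora Bunko novels mark spoken lines with Japanese corner brackets 「…」, the dialogue-extraction step described above can be sketched with a simple regex (nested brackets are not handled in this sketch):

```python
import re


def extract_dialogue(text):
    """Return the utterances enclosed in 「…」 corner brackets."""
    return re.findall(r"「([^「」]*)」", text)


sample = "彼は言った。「おはよう」彼女は答えた。「おはようございます」"
print(extract_dialogue(sample))  # → ['おはよう', 'おはようございます']
```

Consecutive extracted utterances can then be paired up as context/response turns for DialoGPT-style training.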