# Model Card for agrimi7.5B-dolly
This model is a finetuned (SFT) version of Facebook's xglm-7.5B, trained on a machine-translated Greek version of the databricks-dolly-15k dataset. Its purpose is to demonstrate that this pretrained model can be adapted to instruction-following mode using a relatively small dataset such as databricks-dolly-15k.
## Model Details

### Model Description
- Developed by: Andreas Loupasakis
- Model type: Causal Language Model
- Language(s) (NLP): Greek (el)
- License: Apache-2.0
- Finetuned from model: XGLM-7.5B
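Since the card does not document the prompt template used during SFT, the sketch below assumes a standard Dolly-style instruction format; the helper name `build_prompt` and the exact template are illustrative assumptions, not the confirmed training format.

```python
# Hypothetical prompt builder: the exact SFT template is not documented in
# this card, so this Dolly-style layout is an assumption.
def build_prompt(instruction: str, context: str = "") -> str:
    """Assemble an instruction prompt, optionally with input context."""
    if context:
        return (
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{context}\n\n"
            f"### Response:\n"
        )
    return f"### Instruction:\n{instruction}\n\n### Response:\n"


# Loading and generating with the model would follow the usual transformers
# pattern; the repo id below is an assumption — replace it with the actual
# Hub path of this model:
#
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("agrimi7.5B-dolly")
# model = AutoModelForCausalLM.from_pretrained("agrimi7.5B-dolly", device_map="auto")
# inputs = tokenizer(build_prompt("Τι είναι η Ακρόπολη;"), return_tensors="pt")
# outputs = model.generate(**inputs, max_new_tokens=200)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The generation snippet is left commented out because it downloads 7.5B parameters of weights; the prompt builder itself is plain Python and can be reused as-is.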