---
license: gpl-3.0
datasets:
- databricks/databricks-dolly-15k
language:
- en
pipeline_tag: question-answering
---
Minimal Alpaca-LoRA trained on the [databricks/databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) dataset and based on [OpenLLaMA-3B-600BT](https://huggingface.co/openlm-research/open_llama_3b_600bt_preview).
### I don't have access to powerful GPUs, so I am training it on Google Colab. I am preparing a Jupyter notebook for the training process and will release it as well.
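
A minimal usage sketch with `transformers` and `peft`, assuming the LoRA adapter weights are published on the Hub; the adapter repository id below is a placeholder, not the actual repo name:

```python
# Load the OpenLLaMA-3B base model and apply the LoRA adapter on top of it.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model_id = "openlm-research/open_llama_3b_600bt_preview"
adapter_id = "your-username/alpaca-lora-openllama-3b"  # placeholder adapter repo id

tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(base_model_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(model, adapter_id)

# Alpaca-style prompt with an instruction and no additional input.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat is the capital of France?\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```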