---
base_model: HuggingFaceTB/SmolLM2-1.7B
language:
- en
- bg
license: apache-2.0
tags:
- text-generation-inference
- transformers
- llama
- trl
datasets:
- petkopetkov/oasst1_bg
---
# SmolLM2-1.7B-Bulgarian
- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** HuggingFaceTB/SmolLM2-1.7B

SmolLM2-1.7B fine-tuned on the OASST1 dataset translated into Bulgarian.
### Usage
First, install the Transformers library with:
```sh
pip install -U transformers
```
#### Run with the `pipeline` API
```python
import torch
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="petkopetkov/SmolLM2-1.7B-bg",
    torch_dtype=torch.bfloat16,  # halves memory use; needs bf16-capable hardware
    device_map="auto",           # automatic device placement; requires `accelerate`
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"
print(pipe(prompt)[0]["generated_text"])
```
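#### Run with `AutoModelForCausalLM`

For more control over tokenization and generation parameters, the model can also be loaded directly. This is a minimal sketch using the same checkpoint id as the `pipeline` example above; the generation settings (`max_new_tokens`, `do_sample`) are illustrative defaults, not values prescribed by this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "petkopetkov/SmolLM2-1.7B-bg"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory use; needs bf16-capable hardware
    device_map="auto",           # automatic device placement; requires `accelerate`
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The decoded output includes the prompt itself; slice it off with `outputs[0][inputs["input_ids"].shape[-1]:]` if only the completion is needed.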