---
library_name: transformers
language:
- ru
- en
pipeline_tag: text-generation
license: other
license_name: apache-2.0
license_link: https://huggingface.co/MTSAIR/Cotype-Nano-GGUF/blob/main/Apache%20License%20MTS%20AI.docx
---
# Cotype Nano GGUF🤖
Cotype-Nano-GGUF is a lightweight LLM that can run even on mobile devices.
### Inference
```python
from llama_cpp import Llama

# Download the quantized model from the Hugging Face Hub and load it.
llm = Llama.from_pretrained(
    repo_id="MTSAIR/Cotype-Nano-GGUF",
    filename="cotype_nano_8bit.gguf",
)

# Run a single chat completion request.
response = llm.create_chat_completion(
    messages=[
        {
            "role": "user",
            "content": "What is the capital of France?",
        }
    ]
)

print(response["choices"][0]["message"]["content"])
```
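
For longer answers, the same `create_chat_completion` API can stream tokens as they are generated instead of returning the full response at once. A minimal sketch, assuming the model has already been loaded as `llm` as shown above; the system prompt text here is illustrative and not part of the original card:

```python
# Request a streamed response: the call returns an iterator of chunks.
stream = llm.create_chat_completion(
    messages=[
        # Illustrative system prompt (assumption, not from the model card).
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
    stream=True,
)

for chunk in stream:
    delta = chunk["choices"][0]["delta"]
    # The first chunk carries the role; later chunks carry pieces of the content.
    if "content" in delta:
        print(delta["content"], end="", flush=True)
print()
```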