Add LocalAI configuration
This allows running the model directly with LocalAI by pointing it at the URL of the model configuration file, for example:
local-ai run huggingface://fakezeta/Llama3-Aloe-8B-Alpha-ov-int8/model.yaml
model.yaml (new file, +11 lines):
name: llama3-aloe
backend: transformers
parameters:
  model: fakezeta/Llama3-Aloe-8B-Alpha-ov-int8
context_size: 8192
type: OVModelForCausalLM
template:
  use_tokenizer_template: true
stopwords:
- "<|eot_id|>"
- "<|end_of_text|>"
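Once LocalAI has loaded this configuration, the model is served under the `name` field ("llama3-aloe") through LocalAI's OpenAI-compatible API. As a minimal sketch (the host/port `localhost:8080` assumes LocalAI's default; the prompt text is just an illustration), this is the kind of chat-completion payload you would POST to `/v1/chat/completions`:

```python
import json

# Chat-completion request body for LocalAI's OpenAI-compatible endpoint.
# The "model" field must match the `name` declared in model.yaml.
payload = {
    "model": "llama3-aloe",
    "messages": [
        {"role": "user", "content": "What are the symptoms of anemia?"},
    ],
    "temperature": 0.7,
}

# POST this JSON to http://localhost:8080/v1/chat/completions
# (8080 is LocalAI's default port; adjust if your instance differs).
print(json.dumps(payload, indent=2))
```

The `<|eot_id|>` and `<|end_of_text|>` stopwords in the config ensure generation stops at Llama 3's end-of-turn markers, so the API response contains only the assistant's reply.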