---
license: llama3.1
---

GGUF llama.cpp quantized version of:

Update (2024-07-27): The -ropefix versions include the latest fixes needed to use the full 128k context window. They were quantized with llama.cpp b3472, which is also the minimum version required to run them.
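For illustration, here is a minimal llama-cpp-python sketch of loading a -ropefix quant with the full 128k context. The file name, context size, and offload setting are assumptions for the example, not part of this repo; make sure the installed llama.cpp backend is at least b3472.

```python
# Minimal sketch (not from this repo): load a -ropefix GGUF with the full 128k context.
from llama_cpp import Llama

llm = Llama(
    model_path="Meta-Llama-3.1-8B-Instruct-ropefix-Q4_K_M.gguf",  # hypothetical file name
    n_ctx=131072,      # request the full 128k context window
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
)

out = llm("Hello", max_tokens=32)
print(out["choices"][0]["text"])
```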

Update: Use the -imatrix versions (they were quantized with an importance matrix and use the bpe-llama tokenizer, which should theoretically improve the output).

## Recommended Prompt Format (Llama3)

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Provide some context and/or instructions to the model.<|eot_id|><|start_header_id|>user<|end_header_id|>

The user's message goes here<|eot_id|><|start_header_id|>assistant<|end_header_id|>

AI message goes here<|eot_id|>
```
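If you build the prompt manually (for example with llama-cpp-python's plain completion API rather than its chat API), the template above can be filled in as sketched below; the helper name and example messages are illustrative, not part of this repo.

```python
# Hypothetical helper that fills in the Llama 3 prompt template shown above.
def build_llama3_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|><|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_llama3_prompt(
    system="You are a concise assistant.",
    user="Explain what a GGUF file is in one sentence.",
)
# Pass `prompt` to the completion call and stop on the end-of-turn token, e.g.:
#   llm(prompt, max_tokens=256, stop=["<|eot_id|>"])
```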

Quant version (llama.cpp): b3445