---
base_model: Monero/Manticore-13b-Chat-Pyg-Guanaco
tags:
- manticore
- llama-cpp
- llama
---
|
|
|
<!DOCTYPE html>
<html lang="en">
<head>
<style>
h1 {
  color: #FF0000;
  text-decoration: none;
}
</style>
</head>
<body>
<h1>!!! Archive of LLaMa-1-13B Model !!!</h1>
</body>
</html>
|
|
|
# May 27, 2023 - Monero/Manticore-13b-Chat-Pyg-Guanaco
|
|
|
v000000
|
|
|
This model was converted to GGUF format from [`Monero/Manticore-13b-Chat-Pyg-Guanaco`](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco) using llama.cpp.
|
Refer to the [original model card](https://huggingface.co/Monero/Manticore-13b-Chat-Pyg-Guanaco) for more details on the model.
|
|
|
* **Quants in repo:** static Q5_K_M, static Q6_K, static Q8_0
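
A minimal sketch of loading one of these quants with the `llama-cpp-python` bindings. The GGUF filename and the USER/ASSISTANT prompt format below are assumptions (substitute the actual file you download from this repo and the prompt style you prefer):

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# The model_path filename is an assumption; point it at the GGUF file
# downloaded from this repo (e.g. the Q5_K_M quant).
from llama_cpp import Llama

llm = Llama(
    model_path="Manticore-13b-Chat-Pyg-Guanaco-Q5_K_M.gguf",  # assumed filename
    n_ctx=2048,       # LLaMA-1 context window
    n_gpu_layers=-1,  # offload all layers if built with GPU support
)

output = llm(
    "USER: Write a short haiku about llamas.\nASSISTANT:",  # assumed prompt style
    max_tokens=128,
    stop=["USER:"],
)
print(output["choices"][0]["text"])
```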
|
|
|
|
|
Manticore-13b-Chat-Pyg with the Guanaco 13B QLoRA from Tim Dettmers applied.
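
For reference, that kind of adapter merge can be approximated with Transformers + PEFT roughly as below. This is only a sketch: the repo IDs are assumptions, and the actual float16 merge was done upstream, not in this repo.

```python
# Rough sketch of applying a Guanaco 13B QLoRA adapter onto
# Manticore-13b-Chat-Pyg and folding the weights together.
# Repo IDs are assumptions; the real merge was performed upstream.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "openaccess-ai-collective/manticore-13b-chat-pyg"  # assumed base repo
adapter_id = "timdettmers/guanaco-13b"                        # assumed adapter repo

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, adapter_id)
model = model.merge_and_unload()  # fold the LoRA deltas into the base weights

tokenizer = AutoTokenizer.from_pretrained(base_id)
model.save_pretrained("Manticore-13b-Chat-Pyg-Guanaco-merged")
tokenizer.save_pretrained("Manticore-13b-Chat-Pyg-Guanaco-merged")
```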