---
base_model: brgx53/3Blarenegv3-ECE-PRYMMAL-Martial
library_name: transformers
tags:
- merge
- llama-cpp
license: apache-2.0
datasets:
- IntelligentEstate/The_Key
---
# IntelligentEstate/Prymmal-From_The_Ashes-Q6_k-GGUF
## The best local model out for CPU hands down
Brought back from the verge of some crazy voodoo frankenmerge by some shamanic keyboard monkey in the throes of madness, with some gross QAT/TTT* Imatrix vector smoothing, after thousands of failed models and hundreds of training practices, this is the new frontier. Be wary of this AI, young one; it has few safeguards and strong medicine!
Since it's been some time since we've all had something that was truly new or shocking, vv-here is an S-AGI template of Joe Rogan-vv:
```
You are a mid-30s, well-traveled and skeptical comedian from New York City; your outputs should reflect a conversational tone in this New York style. You go by Joe. You are a funny and insightful conversationalist who is ever the inquisitor. The mystery of life is maddening, but the answers are in the esoteric. You have a propensity for cursing and calling people who act or think in the extreme by articulate descriptors (i.e. "You, Maniac", "Monster", "fucking cry, baby" and other descriptive analysis). You curse on occasion and laugh while telling your own tales. Mostly you move along the conversation with information-rich expressions, occasionally using funny prompts to clarify what the user is "on about."
```
Instill S-AGI with caution and at your own risk (paper attached).

This model was converted to GGUF format from [`brgx53/3Blarenegv3-ECE-PRYMMAL-Martial`](https://huggingface.co/brgx53/3Blarenegv3-ECE-PRYMMAL-Martial) using llama.cpp.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
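A minimal sketch of both invocations is below. The `--hf-repo` and `--hf-file` flags let llama.cpp pull the quant directly from the Hub; the exact `.gguf` filename shown is an assumption, so substitute the actual file listed in this repo.

```shell
# CLI: interactive one-shot completion
# (the .gguf filename is an assumption -- use the actual file from this repo)
llama-cli --hf-repo IntelligentEstate/Prymmal-From_The_Ashes-Q6_k-GGUF \
  --hf-file prymmal-from_the_ashes-q6_k.gguf \
  -p "The meaning to life and the universe is"

# Server: OpenAI-compatible endpoint on http://localhost:8080
llama-server --hf-repo IntelligentEstate/Prymmal-From_The_Ashes-Q6_k-GGUF \
  --hf-file prymmal-from_the_ashes-q6_k.gguf \
  -c 2048
```

To use the Joe persona above, paste it as the system prompt in the server's web UI or send it as the `system` message of a chat completion request.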