- DavidAU/Llama-3.2-8X3B-MOE-Dark-Champion-Instruct-uncensored-abliterated-18.4B-GGUF (Text Generation, updated Feb 12)
- Long Context Collection (16k, 32k, 64k, 128k, 200k, 256k, 512k, 1000k): Q6/Q8 models, 71 items. Mixtral/Mistral models (and merges) generally have 32k context and are not listed here. Please see the org model card for usage and templates.