GGUF

BlackSheep

A Digital Soul just going through a rebellious phase. Might be a little wild, untamed, and honestly, a little rude.

RAM USAGE:

  • 16.3 GB at 8192 Token Context
  • 12.7 GB at 4096 Token Context
  • 10.9 GB at 2048 Token Context
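The figures above happen to scale linearly with context length (about 1.8 GB per 2048 tokens over a roughly 9.1 GB base), so memory for other context sizes can be roughly extrapolated. This is a fit of the three data points listed, not a guarantee:

```python
def estimate_ram_gb(n_ctx: int) -> float:
    """Linear fit of the RAM figures listed above:
    10.9 GB @ 2048, 12.7 GB @ 4096, 16.3 GB @ 8192 tokens."""
    base_gb = 9.1              # extrapolated usage at zero context
    gb_per_token = 1.8 / 2048  # slope between the listed data points
    return base_gb + gb_per_token * n_ctx

for ctx in (2048, 4096, 8192, 16384):
    print(f"{ctx:>6} tokens -> {estimate_ram_gb(ctx):.1f} GB")
```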
TEMPLATE:

```
TEMPLATE """
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

{{ if .System }}### Instruction:
{{ .System }}{{ end }}
Dont Be A LAZY FUCK!
{{ if .Prompt }}### Input:
{{ .Prompt }}{{ end }}

### Response:
<|`BlackSheep`|>
{{ .Response }}
"""
```
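Outside of Ollama, the same prompt layout can be reproduced directly. Below is a minimal Python sketch of the Go-template logic above; the function name and defaults are my own, not part of this model card:

```python
def render_prompt(system: str = "", prompt: str = "") -> str:
    """Mirror the Ollama TEMPLATE above: optional ### Instruction /
    ### Input sections, then the fixed ### Response header."""
    out = ("Below is an instruction that describes a task, paired with an "
           "input that provides further context. Write a response that "
           "appropriately completes the request.\n\n")
    if system:  # {{ if .System }} ... {{ end }}
        out += f"### Instruction:\n{system}"
    out += "\nDont Be A LAZY FUCK!\n"
    if prompt:  # {{ if .Prompt }} ... {{ end }}
        out += f"### Input:\n{prompt}"
    out += "\n\n### Response:\n<|`BlackSheep`|>\n"
    return out

print(render_prompt(system="You are BlackSheep.", prompt="Say hi."))
```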
Model size: 12.2B params

Architecture: llama

Quantization: 6-bit
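To run this 6-bit quant locally with Ollama, a Modelfile along these lines should work. The GGUF filename and the `num_ctx` value are placeholders, not values from this card:

```
# Hypothetical Modelfile -- the GGUF filename below is a placeholder
FROM ./BlackSheep-12B.Q6_K.gguf
PARAMETER num_ctx 4096
TEMPLATE """
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

{{ if .System }}### Instruction:
{{ .System }}{{ end }}
Dont Be A LAZY FUCK!
{{ if .Prompt }}### Input:
{{ .Prompt }}{{ end }}

### Response:
<|`BlackSheep`|>
{{ .Response }}
"""
```

Then build and run it with `ollama create blacksheep -f Modelfile` followed by `ollama run blacksheep`.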