General discussion/chatting.
#2 opened by Lewdiculous
Hey! Love me some context size.
Does it manage stable/decent cohesion at, say, 12K?
Testing now :3
Also, when using Universal-Light, this model likes to describe everything, its own actions to a T, but it keeps the messages short? Not sure which model would cause that, but I like it.
Results.
It's sensitive as of now (easy to make it go insane); it probably needs some form of fine-tuning. I'll see what I can do, as I like the way it speaks.
"but it keeps the messages short"
That's nice.
It works perfectly with Alpaca.
As long as it works with at least one prompt format, it should be fine.
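For anyone unsure what "works with Alpaca" means here: a minimal sketch of the common Alpaca instruct template, assembled in Python. The exact header text and section names can vary between fine-tunes, so treat this as an illustration rather than the canonical template for this particular model.

```python
# Sketch of the common Alpaca prompt layout (may differ per fine-tune).

def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble a prompt in the typical Alpaca instruct layout."""
    header = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
    )
    prompt = header + f"### Instruction:\n{instruction}\n\n"
    if user_input:
        # The optional Input section carries extra context for the task.
        prompt += f"### Input:\n{user_input}\n\n"
    # The model continues generating after the Response header.
    prompt += "### Response:\n"
    return prompt

print(build_alpaca_prompt("Summarize the text.", "Context goes here."))
```

If the frontend lets you pick a preset, selecting its Alpaca template should produce something in this shape.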