Can you increase the context length?

#3
by starmanj - opened

My discussions crash fairly quickly with an out-of-context-memory error. The model reports that it can only use 4096 tokens of context.

The context length can be increased up to 16k, and possibly further.
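Until the context window is raised, one workaround on the client side is trimming older turns so the prompt stays under the limit. A minimal sketch of that idea (the `trim_history` helper and its whitespace-based token count are hypothetical; a real deployment should count tokens with the model's own tokenizer):

```python
def trim_history(messages, max_tokens=4096, count=lambda m: len(m.split())):
    """Keep the most recent messages whose combined (approximate)
    token count fits within the model's context window."""
    kept, total = [], 0
    # Walk the conversation from newest to oldest, stopping once
    # the budget would be exceeded.
    for msg in reversed(messages):
        n = count(msg)
        if total + n > max_tokens:
            break
        kept.append(msg)
        total += n
    return list(reversed(kept))
```

Dropping whole messages from the front keeps each surviving turn intact, which usually degrades the conversation more gracefully than truncating mid-message.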
