Airoboros 13b GPT4 1.4 merged with kaiokendev's SuperHOT 8k LoRA.

The code used to merge them can be found here; adjust the model paths and names as needed.
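For reference, merging a LoRA into a base model generally follows the pattern below. This is a minimal sketch using PEFT; the repo IDs are the presumed upstream base model and LoRA, not links verified here.

```python
# A minimal LoRA-merge sketch using PEFT; the repo IDs below are the
# presumed upstream base model and LoRA -- adjust them as needed.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "jondurbin/airoboros-13b-gpt4-1.4", torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base, "kaiokendev/superhot-13b-8k-no-rlhf-test")
merged = model.merge_and_unload()  # fold the LoRA deltas into the base weights
merged.save_pretrained("airoboros-13b-gpt4-1.4-superhot-8k")
```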

NOTE: This model requires a monkey patch to work. FlashVenom has kindly quantised it to 4-bit and added the monkey-patch file to their repo. You can access it here.

FROM THE ORIGINAL LORA MODEL CARD: This is a second prototype of SuperHOT, this time with 8K context and no RLHF. In my testing, it can go all the way to 6K without breaking down, and I made the change with the intention of reaching 8K, so I'll assume it will go to 8K although I only trained on 4K sequences.

In order to use the 8K context, you will need to apply the monkey patch I have added in this repo -- without it, it will not work. The patch is very simple, and you can make the changes yourself (see the sketch after the list below):

- Increase `max_position_embeddings` to 8192 to stretch the sinusoidal position embeddings
- Stretch the frequency steps by a scale of 0.25
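In case the patch file is unavailable, the following is a minimal sketch of what those two changes amount to, assuming a pre-4.31 transformers Llama implementation in which `LlamaRotaryEmbedding` precomputes cos/sin caches. The class name here is illustrative and may not match the actual patch file.

```python
# A sketch of the RoPE-scaling monkey patch, assuming the older
# transformers Llama implementation that caches cos/sin tables.
import torch
import transformers.models.llama.modeling_llama as llama


class ScaledRotaryEmbedding(torch.nn.Module):
    def __init__(self, dim, max_position_embeddings=2048, base=10000, device=None):
        super().__init__()
        self.scale = 0.25                      # stretch frequency steps by 0.25
        max_position_embeddings = 8192         # stretch to the 8K target length
        inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float().to(device) / dim))
        self.register_buffer("inv_freq", inv_freq)
        self.max_seq_len_cached = max_position_embeddings
        t = torch.arange(self.max_seq_len_cached, device=device, dtype=self.inv_freq.dtype)
        t = t * self.scale                     # positions now advance in steps of 0.25
        freqs = torch.einsum("i,j->ij", t, self.inv_freq)
        emb = torch.cat((freqs, freqs), dim=-1)
        self.register_buffer("cos_cached", emb.cos()[None, None, :, :], persistent=False)
        self.register_buffer("sin_cached", emb.sin()[None, None, :, :], persistent=False)

    def forward(self, x, seq_len=None):
        # Same return contract as the stock LlamaRotaryEmbedding
        return (
            self.cos_cached[:, :, :seq_len, ...].to(dtype=x.dtype),
            self.sin_cached[:, :, :seq_len, ...].to(dtype=x.dtype),
        )


# Apply the patch before loading the model
llama.LlamaRotaryEmbedding = ScaledRotaryEmbedding
```

Newer transformers releases (4.31+) expose the same idea as linear RoPE scaling via the model config, e.g. `rope_scaling={"type": "linear", "factor": 4.0}`, which is equivalent to the 0.25 position step used here.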