About

ape fiction: "ape" stands for Algorithmic Pattern Emulation. I'm finetuning this model to be used for generating fiction.

Finetuned from Mistral Nemo Base using the fullfictions-85kmax dataset.

This run used roughly the largest-context entries in the dataset that I could train on without out-of-memory (OOM) errors.

I used Unsloth to do the finetuning on a rented H100 GPU for 2 epochs. Thanks to everybody who made this possible: the Unsloth brothers, the folks behind KoboldCPP, the team behind Mistral Nemo, the organizers and volunteers at Gutenberg.org... there are probably more. Thanks, everybody.
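For readers curious what such a run looks like, below is a hypothetical training-configuration sketch in the usual Unsloth + TRL style. The base-model identifier, sequence length, LoRA settings, dataset filename, text field, and hyperparameters are all my assumptions for illustration, not the actual settings used for this model.

```python
# Hypothetical sketch of an Unsloth finetune like the one described above.
# All specific values (model name, max_seq_length, LoRA rank, batch sizes,
# dataset path/field) are assumptions, not the author's real settings.
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the base model with Unsloth's optimized loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="mistralai/Mistral-Nemo-Base-2407",  # assumed base checkpoint
    max_seq_length=32768,   # assumed; long enough for big dataset entries
    load_in_4bit=True,      # QLoRA-style loading to keep VRAM in check
)

# Attach LoRA adapters so only a small set of weights is trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Assumed local JSONL export of the finetuning dataset.
dataset = load_dataset("json", data_files="fullfictions-85kmax.jsonl",
                       split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumed field name
    max_seq_length=32768,
    args=TrainingArguments(
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=2,      # matches the two epochs mentioned above
        learning_rate=2e-4,
        bf16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```

The main trade-off in a setup like this is sequence length versus batch size: with very long entries, the per-device batch size usually drops to 1 and gradient accumulation makes up the effective batch.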

Model details: 12.2B parameters, BF16, Safetensors format.

Quantizations: 2 quantized versions of leftyfeep/ape-fiction are available.