Edgar Allan Poe LLM

EAP is a language model fine-tuned on the EAP dataset with Supervised Fine-Tuning (SFT) using the TRL (Transformer Reinforcement Learning) library. It is based on the Mistral 7B model.

Features

  • Fine-tuned with SFT via the TRL library (a reproduction sketch follows this list)
  • Supports the English language
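
For reference, a fine-tune of this kind can be reproduced with TRL's SFTTrainer. The snippet below is a minimal sketch assuming a recent version of the trl library; the dataset id and base-checkpoint id are illustrative placeholders, not the card's actual training configuration.

from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Hypothetical id for the EAP dataset; substitute the real one.
dataset = load_dataset("glides/eap", split="train")

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.1",  # placeholder base Mistral 7B checkpoint
    train_dataset=dataset,
    args=SFTConfig(output_dir="mistral-eap"),
)
trainer.train()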

Usage

To use the model, load it with the Hugging Face Transformers library. The example below loads the weights in 4-bit via bitsandbytes to reduce memory usage:

from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
import torch

# 4-bit NF4 quantization with nested (double) quantization keeps the
# 7B model's memory footprint low while computing in bfloat16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model_id = "nroggendorff/mistral-eap"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place the quantized weights on the available GPU(s)
)

prompt = "[INST] Write a poem about tomatoes in the style of Poe.[/INST]"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(**inputs, max_new_tokens=256)

generated_text = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
print(generated_text)
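
If the tokenizer ships a chat template, the [INST] wrapper can also be produced with apply_chat_template instead of being written by hand. This is a sketch; whether a template is included depends on the uploaded tokenizer config.

messages = [
    {"role": "user", "content": "Write a poem about tomatoes in the style of Poe."}
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])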

License

This project is licensed under the MIT License.
