
This model was created using GPT-2 as a base and fine-tuned on a dataset of elementary-school problems requiring logic and reasoning. Requires PyTorch.

How to use the model to generate text


from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# The tokenizer comes from the gpt2-large base model
model_name = "gpt2-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the fine-tuned model checkpoint
model_path = '../model.pt'
model = torch.load(model_path)

your_text = "A courier received 50 packages yesterday and twice as many today. All of these should be delivered tomorrow. How many packages should be delivered tomorrow?"
encoded_text = tokenizer.encode(your_text, return_tensors='pt')
outputs = model.generate(encoded_text, max_length=64, do_sample=True, temperature=0.5, top_p=1)
outputs = [tokenizer.decode(output) for output in outputs]
print(outputs[0])
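The temperature=0.5 setting above makes sampling more deterministic: logits are divided by the temperature before the softmax, so values below 1.0 sharpen the distribution toward the most likely token. A minimal self-contained sketch of that rescaling (plain Python, no model required; the logit values are illustrative only):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 1.0, 0.5]            # example token logits (illustrative)
p_default = softmax(logits)                    # temperature = 1.0
p_sharp = softmax([x / 0.5 for x in logits])   # temperature = 0.5

# Lower temperature concentrates probability mass on the top token
assert max(p_sharp) > max(p_default)
```

With temperature=1.0 the distribution is used as-is; pushing the temperature toward 0 approaches greedy decoding.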