## Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

tokenizer = AutoTokenizer.from_pretrained("zeusfsx/title-instruction")
model = AutoModelForSequenceClassification.from_pretrained("zeusfsx/title-instruction")

# device="mps" targets Apple Silicon GPUs; use device=0 for CUDA or device=-1 for CPU
classification = pipeline("text-classification", model=model, tokenizer=tokenizer, device="mps")
```
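The pipeline is then called directly on a string (or a list of strings) and returns a list of `{"label", "score"}` dicts, as all `text-classification` pipelines do. A minimal sketch of handling that output — the dict below is illustrative only, not an actual model run, and the label names are assumptions inferred from the model name:

```python
# result = classification("How to fine-tune a transformer")  # real call, needs the model
# Illustrative output shape (labels "title"/"instruction" are assumptions):
result = [{"label": "instruction", "score": 0.97}]

# Pick the highest-scoring label for the input text
top = max(result, key=lambda r: r["score"])
print(top["label"], round(top["score"], 2))
```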