# Model Card for Model ID
This model was created with Argilla and trained with Transformers. It is a sample model finetuned from prajjwal1/bert-tiny.
## Model training
Training the model using the ArgillaTrainer:
```python
from argilla.feedback import ArgillaTrainer, FeedbackDataset, TrainingTask

# Load the dataset:
dataset = FeedbackDataset.from_huggingface("argilla/emotion")

# Create the training task:
task = TrainingTask.for_text_classification(
    text=dataset.field_by_name("text"),
    label=dataset.question_by_name("label"),
)

# Create the ArgillaTrainer:
trainer = ArgillaTrainer(
    dataset=dataset,
    task=task,
    framework="transformers",
    model="prajjwal1/bert-tiny",
)

# Override a few training arguments:
trainer.update_config({
    "logging_steps": 1,
    "num_train_epochs": 1,
    "output_dir": "tmp",
})

# Train the model and save it to the output directory:
trainer.train(output_dir="tmp")
```
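After training, the checkpoint can be pushed to the Hugging Face Hub. Below is a minimal sketch, assuming your Argilla version provides the `ArgillaTrainer.push_to_huggingface` helper (introduced around Argilla 1.19, which also generates a model card like this one); the repository id is a placeholder, not this model's actual repo:

```python
# Push the fine-tuned model to the Hub and auto-generate a model card.
# "my-username/bert-tiny-emotion" is a placeholder repository id.
trainer.push_to_huggingface("my-username/bert-tiny-emotion", generate_card=True)
```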
You can test the model's predictions like so:

```python
trainer.predict("This is awesome!")
```
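Since the result is a regular Transformers text-classification checkpoint, it can also be loaded without Argilla once it is on the Hub (or from the local output directory). A minimal sketch, where the repository id is a placeholder for this model's actual repo:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as a standard text-classification pipeline.
# Replace the placeholder repository id with the actual model repo.
classifier = pipeline("text-classification", model="my-username/bert-tiny-emotion")
classifier("This is awesome!")
```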
## Model Details

### Model Description

Model trained with the ArgillaTrainer for demo purposes.
- Developed by: [More Information Needed]
- Shared by [optional]: [More Information Needed]
- Model type: Finetuned version of prajjwal1/bert-tiny for demo purposes
- Language(s) (NLP): en
- License: apache-2.0
- Finetuned from model [optional]: prajjwal1/bert-tiny
### Model Sources [optional]
- Repository: N/A
## Technical Specifications [optional]

### Framework Versions
- Python: 3.10.7
- Argilla: 1.19.0-dev