---
license: apache-2.0
base_model: distilbert-base-cased
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: distilbert-cased-reviews-v1
    results: []
widget:
  - text: >-
      Red Hot Chili Peppers on vinyl has been a disappointing experience.. I had
      to return both “By The Way” and “Stadium Arcadium” because there were
      skips on almost all of it.. Kind of made it seem like the record label
      just went cheap, which is a disservice to anyone that actually listens to
      their vinyl...This “Greatest Hits” compilation did not have the same
      problems as the other two I bought. It sounded as it should have, and
      there were no skips.
datasets:
  - yyu/amazon-attrprompt
language:
  - en
---

# distilbert-cased-reviews-v1

This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the [yyu/amazon-attrprompt](https://huggingface.co/datasets/yyu/amazon-attrprompt) dataset. It achieves the following results on the evaluation set:

- Loss: 1.9022
- Accuracy: 0.7478
- F1: 0.7350
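
For quick experimentation, the fine-tuned checkpoint can be loaded with the `pipeline` API. A minimal sketch, assuming the model is published under the hypothetical repo id `Asteriks/distilbert-cased-reviews-v1` (substitute the actual path) and that the label names follow the mapping saved with the checkpoint:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual location of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="Asteriks/distilbert-cased-reviews-v1",
)

review = (
    "This Greatest Hits compilation did not have the same problems as the "
    "other two I bought. It sounded as it should have, and there were no skips."
)
print(classifier(review))
# Returns a list like [{'label': ..., 'score': ...}]; the label names depend
# on the id2label mapping stored in the model config.
```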

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
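
The card does not include the training script; the following is a minimal `Trainer` sketch that matches the hyperparameters listed above. The dataset column names (`text`, `label`), the split names, and the per-epoch `evaluation_strategy` are assumptions, not details taken from the original run:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Assumption: the dataset exposes "text"/"label" columns and train/validation
# splits; adjust to the actual schema of yyu/amazon-attrprompt.
dataset = load_dataset("yyu/amazon-attrprompt")
num_labels = dataset["train"].features["label"].num_classes  # assumes a ClassLabel column

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-cased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-cased", num_labels=num_labels
)

# Hyperparameters copied from the list above; the Adam betas/epsilon shown
# there are the library defaults, so they need no explicit arguments.
training_args = TrainingArguments(
    output_dir="distilbert-cased-reviews-v1",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    evaluation_strategy="epoch",  # assumption: per-epoch rows in Training results suggest epoch-level eval
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],  # assumption: validation split name
    tokenizer=tokenizer,
)
trainer.train()
```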

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.2514        | 1.0   | 1728 | 1.8422          | 0.7348   | 0.7217 |
| 0.2709        | 2.0   | 3456 | 1.8755          | 0.7348   | 0.7200 |
| 0.0912        | 3.0   | 5184 | 1.9022          | 0.7478   | 0.7350 |
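
The accuracy and F1 columns were originally logged as raw metric dictionaries, which is the output format of the `evaluate` library. A sketch of a `compute_metrics` callback along those lines (the weighted F1 averaging is an assumption):

```python
import numpy as np
import evaluate

accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels),
        "f1": f1_metric.compute(
            predictions=predictions,
            references=labels,
            average="weighted",  # assumption: averaging mode for the multi-class F1
        ),
    }
```

Returning the metric dictionaries unflattened would reproduce the nested `{'accuracy': ...}` / `{'f1': ...}` format that appeared in the original logs.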

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0