---
license: apache-2.0
base_model: facebook/hubert-large-ll60k
tags:
  - generated_from_trainer
model-index:
  - name: hubert_large_emodb
    results: []
---

# hubert_large_emodb

This model is a fine-tuned version of [facebook/hubert-large-ll60k](https://huggingface.co/facebook/hubert-large-ll60k) on the EmoDB (Berlin Database of Emotional Speech) dataset. It achieves the following results on the evaluation set:

- Loss: 0.9789
- UAR: 0.8800
- ACC: 0.8897

And on the test set:

- UAR: 0.805
- ACC: 0.845

F1 scores per class:

- anger: 0.84
- happiness: 0.364
- sadness: 1.0
- neutral: 1.0
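UAR here is unweighted average recall, i.e. recall macro-averaged over the four classes so that each class counts equally regardless of its sample count. A minimal sketch of how these metrics can be computed with scikit-learn (the label names are from this card; the example predictions are purely illustrative, not real model output):

```python
from sklearn.metrics import f1_score, recall_score

labels = ["anger", "happiness", "sadness", "neutral"]

# Illustrative ground-truth and predicted labels (not real model output)
y_true = ["anger", "anger", "happiness", "sadness", "neutral", "neutral"]
y_pred = ["anger", "happiness", "happiness", "sadness", "neutral", "neutral"]

# UAR = macro-averaged recall: every class contributes equally
uar = recall_score(y_true, y_pred, labels=labels, average="macro")

# Per-class F1, in the same label order as above
per_class_f1 = f1_score(y_true, y_pred, labels=labels, average=None)

print(f"UAR: {uar:.3f}")  # 0.875 for this toy example
print("F1 per class:", dict(zip(labels, per_class_f1.round(3))))
```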

## Model description

This model predicts one of four emotion categories from speech audio: anger, happiness, sadness, or neutral.

## Intended uses & limitations

How to use:

```python
from transformers import pipeline

# Load the model through the audio-classification pipeline
pipe = pipeline("audio-classification", model="Bagus/hubert_large_emodb")

# Classify a speech recording; returns a list of {label, score} dicts
pipe("file.wav")
```
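HuBERT models are pretrained on 16 kHz audio, so recordings at other sample rates should be resampled before inference. A minimal sketch using SciPy's polyphase resampler (the 44.1 kHz input rate and the helper name are just for illustration):

```python
import numpy as np
from scipy.signal import resample_poly

def to_16k(audio: np.ndarray, orig_sr: int) -> np.ndarray:
    """Resample a mono waveform to 16 kHz using polyphase filtering."""
    if orig_sr == 16000:
        return audio.astype(np.float32)
    # resample_poly wants integer up/down factors; divide by the gcd to keep them small
    g = np.gcd(orig_sr, 16000)
    return resample_poly(audio, 16000 // g, orig_sr // g).astype(np.float32)

# Example: one second of (random) audio at 44.1 kHz becomes 16000 samples
waveform = np.random.randn(44100).astype(np.float32)
resampled = to_16k(waveform, 44100)
print(resampled.shape)  # (16000,)
```

The resampled array can then be passed to the pipeline in place of a file path.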

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
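The effective batch size follows from gradient accumulation: 8 samples per step × 4 accumulation steps = 32. A small sketch of that arithmetic, plus how a linear scheduler (assuming no warmup, which the settings above suggest) would decay the learning rate over the 60 optimizer steps reached in training:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 1e-4) -> float:
    """Linearly decay the learning rate from base_lr down to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Effective batch size = per-device batch size * gradient accumulation steps
train_batch_size, grad_accum = 8, 4
effective_batch = train_batch_size * grad_accum
print(effective_batch)  # 32

total_steps = 60  # optimizer steps over 10 epochs in this run
for step in (0, 30, 60):
    print(f"step {step:2d}: lr = {linear_lr(step, total_steps):.2e}")
```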

### Training results

| Training Loss | Epoch | Step | Validation Loss | UAR    | ACC    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| No log        | 0.15  | 1    | 1.3865          | 0.25   | 0.1985 |
| No log        | 0.31  | 2    | 1.3794          | 0.25   | 0.1985 |
| No log        | 0.46  | 3    | 1.3745          | 0.25   | 0.1985 |
| No log        | 0.62  | 4    | 1.3684          | 0.3227 | 0.3162 |
| No log        | 0.77  | 5    | 1.3592          | 0.4722 | 0.5809 |
| No log        | 0.92  | 6    | 1.3487          | 0.3981 | 0.5221 |
| 1.4402        | 1.08  | 7    | 1.3406          | 0.4444 | 0.5588 |
| 1.4402        | 1.23  | 8    | 1.3359          | 0.5278 | 0.6250 |
| 1.4402        | 1.38  | 9    | 1.3305          | 0.5418 | 0.6324 |
| 1.4402        | 1.54  | 10   | 1.3228          | 0.5790 | 0.6544 |
| 1.4402        | 1.69  | 11   | 1.3078          | 0.6392 | 0.6985 |
| 1.4402        | 1.85  | 12   | 1.2832          | 0.6577 | 0.7132 |
| 1.4402        | 2.0   | 13   | 1.2445          | 0.6670 | 0.7206 |
| 1.0783        | 2.15  | 14   | 1.2087          | 0.6715 | 0.7279 |
| 1.0783        | 2.31  | 15   | 1.1857          | 0.6579 | 0.7059 |
| 1.0783        | 2.46  | 16   | 1.1746          | 0.6488 | 0.6912 |
| 1.0783        | 2.62  | 17   | 1.1666          | 0.6397 | 0.6765 |
| 1.0783        | 2.77  | 18   | 1.1393          | 0.6443 | 0.6838 |
| 1.0783        | 2.92  | 19   | 1.1079          | 0.6810 | 0.7279 |
| 0.9255        | 3.08  | 20   | 1.0908          | 0.7271 | 0.7721 |
| 0.9255        | 3.23  | 21   | 1.0786          | 0.7131 | 0.7647 |
| 0.9255        | 3.38  | 22   | 1.0697          | 0.6574 | 0.7279 |
| 0.9255        | 3.54  | 23   | 1.0711          | 0.6111 | 0.6912 |
| 0.9255        | 3.69  | 24   | 1.0651          | 0.6389 | 0.7132 |
| 0.9255        | 3.85  | 25   | 1.0596          | 0.6481 | 0.7206 |
| 0.9255        | 4.0   | 26   | 1.0566          | 0.6667 | 0.7353 |
| 0.6547        | 4.15  | 27   | 1.0562          | 0.6667 | 0.7353 |
| 0.6547        | 4.31  | 28   | 1.0553          | 0.7222 | 0.7794 |
| 0.6547        | 4.46  | 29   | 1.0549          | 0.7316 | 0.7794 |
| 0.6547        | 4.62  | 30   | 1.0546          | 0.7456 | 0.7868 |
| 0.6547        | 4.77  | 31   | 1.0516          | 0.7549 | 0.7941 |
| 0.6547        | 4.92  | 32   | 1.0428          | 0.7456 | 0.7868 |
| 0.7058        | 5.08  | 33   | 1.0312          | 0.7502 | 0.7941 |
| 0.7058        | 5.23  | 34   | 1.0235          | 0.7594 | 0.8015 |
| 0.7058        | 5.38  | 35   | 1.0143          | 0.7732 | 0.8162 |
| 0.7058        | 5.54  | 36   | 1.0079          | 0.7963 | 0.8382 |
| 0.7058        | 5.69  | 37   | 1.0049          | 0.7963 | 0.8382 |
| 0.7058        | 5.85  | 38   | 1.0051          | 0.7778 | 0.8235 |
| 0.7058        | 6.0   | 39   | 1.0066          | 0.7593 | 0.8088 |
| 0.4919        | 6.15  | 40   | 1.0119          | 0.7407 | 0.7941 |
| 0.4919        | 6.31  | 41   | 1.0172          | 0.7222 | 0.7794 |
| 0.4919        | 6.46  | 42   | 1.0191          | 0.7130 | 0.7721 |
| 0.4919        | 6.62  | 43   | 1.0175          | 0.7130 | 0.7721 |
| 0.4919        | 6.77  | 44   | 1.0144          | 0.7222 | 0.7794 |
| 0.4919        | 6.92  | 45   | 1.0094          | 0.7222 | 0.7794 |
| 0.5048        | 7.08  | 46   | 1.0050          | 0.7593 | 0.8088 |
| 0.5048        | 7.23  | 47   | 0.9984          | 0.7870 | 0.8309 |
| 0.5048        | 7.38  | 48   | 0.9948          | 0.7778 | 0.8235 |
| 0.5048        | 7.54  | 49   | 0.9917          | 0.7825 | 0.8235 |
| 0.5048        | 7.69  | 50   | 0.9884          | 0.8195 | 0.8529 |
| 0.5048        | 7.85  | 51   | 0.9846          | 0.8242 | 0.8529 |
| 0.5048        | 8.0   | 52   | 0.9827          | 0.8152 | 0.8382 |
| 0.4133        | 8.15  | 53   | 0.9816          | 0.8337 | 0.8529 |
| 0.4133        | 8.31  | 54   | 0.9812          | 0.8522 | 0.8676 |
| 0.4133        | 8.46  | 55   | 0.9810          | 0.8522 | 0.8676 |
| 0.4133        | 8.62  | 56   | 0.9810          | 0.8707 | 0.8824 |
| 0.4133        | 8.77  | 57   | 0.9806          | 0.8800 | 0.8897 |
| 0.4133        | 8.92  | 58   | 0.9796          | 0.8800 | 0.8897 |
| 0.4717        | 9.08  | 59   | 0.9793          | 0.8800 | 0.8897 |
| 0.4717        | 9.23  | 60   | 0.9789          | 0.8800 | 0.8897 |


### Framework versions

- Transformers 4.32.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.13.3