---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: Regression_bert_1
  results: []
---


# Regression_bert_1

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results after the final training epoch:
- Train Loss: 0.2128
- Train MAE: 0.2623
- Train MSE: 0.1098
- Train Accuracy: 0.8615
- Train R2-score: 0.8081
- Validation Loss: 0.1657
- Validation MAE: 0.3472
- Validation MSE: 0.1644
- Validation Accuracy: 0.7027
- Validation R2-score: 0.8599
- Epoch: 9

## Model description

Details are not documented. Based on the model name and the reported metrics (MAE, MSE, R2-score), this appears to be [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) fine-tuned with a single-output regression head; the target variable and task are not specified.

## Intended uses & limitations

More information needed
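
Since the dataset and task are undocumented, treat any downstream use with caution. As a minimal sketch of how a checkpoint like this could be loaded for inference, assuming it was exported as a standard Transformers TF checkpoint with a one-output regression head (`num_labels=1`) and using a hypothetical repository path:

```python
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Hypothetical repository path; substitute the actual location of this model.
model_id = "your-namespace/Regression_bert_1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Assumes the checkpoint was saved with num_labels=1 (regression head);
# the card does not state how the model was exported.
model = TFAutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Example input text", return_tensors="tf", truncation=True)
outputs = model(inputs)
# With a single-output regression head, the raw logit is the predicted value.
prediction = float(outputs.logits[0][0])
print(prediction)
```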

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: Adam
  - learning_rate: 1e-04
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-07
  - amsgrad: False
  - weight_decay: None
  - clipnorm: None
  - global_clipnorm: None
  - clipvalue: None
  - use_ema: False
  - ema_momentum: 0.99
  - ema_overwrite_frequency: None
  - jit_compile: True
  - is_legacy_optimizer: False
- training_precision: float32
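
The logged configuration corresponds to the (non-legacy) Keras Adam optimizer that became the default in TensorFlow 2.11. A sketch of how it could be reconstructed, using only the values recorded above:

```python
import tensorflow as tf

# Rebuild the optimizer from the logged configuration. The original
# training script is not available; this only mirrors the values above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=1e-4,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay=None,
    use_ema=False,       # ema_momentum (0.99) is the default and unused here
    jit_compile=True,    # XLA-compile the optimizer update step
)
```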

### Training results

| Train Loss | Train MAE | Train MSE | Train Accuracy | Train R2-score | Validation Loss | Validation MAE | Validation MSE | Validation Accuracy | Validation R2-score | Epoch |
|:----------:|:---------:|:---------:|:--------------:|:--------------:|:---------------:|:--------------:|:--------------:|:-------------------:|:-------------------:|:-----:|
| 0.6256     | 0.3353    | 0.1579    | 0.7615         | 0.4024         | 0.2916          | 0.4907         | 0.2909         | 0.3243              | 0.7810              | 0     |
| 0.3639     | 0.3290    | 0.1605    | 0.7077         | 0.3874         | 0.3009          | 0.5004         | 0.3003         | 0.3243              | 0.7733              | 1     |
| 0.1835     | 0.2940    | 0.1415    | 0.6615         | 0.7274         | 0.2086          | 0.3992         | 0.2075         | 0.3243              | 0.8417              | 2     |
| 0.1707     | 0.2955    | 0.1462    | 0.5846         | 0.7594         | 0.1872          | 0.3705         | 0.1859         | 0.3243              | 0.8547              | 3     |
| 0.1628     | 0.2740    | 0.1251    | 0.8077         | 0.7588         | 0.1867          | 0.3707         | 0.1854         | 0.4595              | 0.8547              | 4     |
| 0.1541     | 0.2695    | 0.1221    | 0.7769         | 0.7405         | 0.1851          | 0.3696         | 0.1839         | 0.5946              | 0.8549              | 5     |
| 0.2239     | 0.2983    | 0.1388    | 0.7154         | 0.7428         | 0.2561          | 0.4564         | 0.2552         | 0.3243              | 0.7987              | 6     |
| 0.1998     | 0.2815    | 0.1295    | 0.7538         | 0.7537         | 0.1979          | 0.3872         | 0.1968         | 0.3514              | 0.8473              | 7     |
| 0.1682     | 0.2743    | 0.1260    | 0.7692         | 0.7532         | 0.1515          | 0.3350         | 0.1500         | 0.9730              | 0.8691              | 8     |
| 0.2128     | 0.2623    | 0.1098    | 0.8615         | 0.8081         | 0.1657          | 0.3472         | 0.1644         | 0.7027              | 0.8599              | 9     |
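
Note that validation loss and validation MSE were lowest at epoch 8 (0.1515 and 0.1500); the summary metrics at the top of this card are taken from the final epoch (9), not from the best-performing one.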


### Framework versions

- Transformers 4.27.3
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2
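
This environment can be reproduced with `pip install transformers==4.27.3 tensorflow==2.11.0 datasets==2.10.1 tokenizers==0.13.2`.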