---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: Regression_bert_1
  results: []
---

<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->

# Regression_bert_1

This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.3594
- Train MAE: 0.2822
- Train MSE: 0.1206
- Train R2-score: 0.6163
- Train Accuracy: 0.5308
- Validation Loss: 0.3503
- Validation MAE: 0.3488
- Validation MSE: 0.1574
- Validation R2-score: 0.8718
- Validation Accuracy: 0.2703
- Epoch: 9
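
The card does not include a usage snippet. The sketch below shows one way the checkpoint could be loaded for inference; the repository id `Regression_bert_1` is taken from the card's name and may need to be replaced with the full path the model is actually published under, and `num_labels=1` is an assumption consistent with the single-target regression metrics reported above.

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

# Placeholder repository id taken from the card name; substitute the actual
# path the checkpoint is published under.
model_id = "Regression_bert_1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels=1 assumes a single regression target, matching the MAE/MSE/R2
# metrics reported in this card.
model = TFAutoModelForSequenceClassification.from_pretrained(model_id, num_labels=1)

inputs = tokenizer("Example sentence to score.", return_tensors="tf")
outputs = model(inputs)
# With a single-output regression head, logits has shape (batch_size, 1).
score = float(outputs.logits[0, 0])
print(score)
```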

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': 2e-05, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
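
Expressed as Keras code, the serialized optimizer config above corresponds roughly to the sketch below (TF 2.11's non-legacy Adam, as `is_legacy_optimizer: False` indicates). The MSE loss and `num_labels=1` are assumptions consistent with the regression metrics reported in this card, not settings recorded by the author.

```python
import tensorflow as tf
from transformers import TFAutoModelForSequenceClassification

# Rebuild the optimizer from the serialized config above.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    jit_compile=True,
)

# Assumed setup: single-output regression head on bert-base-cased trained with
# an MSE loss; the card itself does not state the loss function used.
model = TFAutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=1
)
model.compile(optimizer=optimizer, loss="mse", metrics=["mae", "mse"])
```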

### Training results

| Train Loss | Train MAE | Train MSE | Train R2-score | Train Accuracy | Validation Loss | Validation MAE | Validation MSE | Validation R2-score | Validation Accuracy | Epoch |
|:----------:|:---------:|:---------:|:--------------:|:--------------:|:---------------:|:--------------:|:--------------:|:-------------------:|:-------------------:|:-----:|
| 0.4941     | 0.2941    | 0.1183    | -0.5444        | 0.5769         | 0.3126          | 0.3108         | 0.1099         | 0.8865              | 0.2703              | 0     |
| 0.4660     | 0.3256    | 0.1546    | 0.0002         | 0.5231         | 0.3682          | 0.3669         | 0.1835         | 0.8572              | 0.2703              | 1     |
| 0.4110     | 0.3178    | 0.1552    | 0.6834         | 0.5            | 0.4381          | 0.4369         | 0.2390         | 0.8207              | 0.2703              | 2     |
| 0.3886     | 0.3112    | 0.1560    | 0.7184         | 0.5231         | 0.3566          | 0.3552         | 0.1672         | 0.8661              | 0.2703              | 3     |
| 0.4055     | 0.2890    | 0.1248    | 0.7655         | 0.6077         | 0.4364          | 0.4353         | 0.2376         | 0.8218              | 0.2703              | 4     |
| 0.3955     | 0.2930    | 0.1272    | 0.7685         | 0.5538         | 0.3868          | 0.3855         | 0.1971         | 0.8489              | 0.2703              | 5     |
| 0.3949     | 0.3003    | 0.1386    | 0.3857         | 0.5154         | 0.3614          | 0.3600         | 0.1751         | 0.8620              | 0.2703              | 6     |
| 0.3390     | 0.2874    | 0.1306    | 0.7121         | 0.5231         | 0.3766          | 0.3753         | 0.1894         | 0.8542              | 0.2703              | 7     |
| 0.3556     | 0.2775    | 0.1190    | 0.7890         | 0.5231         | 0.3561          | 0.3547         | 0.1664         | 0.8667              | 0.2703              | 8     |
| 0.3594     | 0.2822    | 0.1206    | 0.6163         | 0.5308         | 0.3503          | 0.3488         | 0.1574         | 0.8718              | 0.2703              | 9     |


### Framework versions

- Transformers 4.27.2
- TensorFlow 2.11.0
- Datasets 2.10.1
- Tokenizers 0.13.2