---
license: mit
tags:
- generated_from_trainer
model-index:
- name: bert_base_tcm_teste
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert_base_tcm_teste

This model is a fine-tuned version of [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0205
- Criterio Julgamento Precision: 0.7719
- Criterio Julgamento Recall: 0.8462
- Criterio Julgamento F1: 0.8073
- Criterio Julgamento Number: 104
- Data Sessao Precision: 0.7812
- Data Sessao Recall: 0.9091
- Data Sessao F1: 0.8403
- Data Sessao Number: 55
- Modalidade Licitacao Precision: 0.9507
- Modalidade Licitacao Recall: 0.9620
- Modalidade Licitacao F1: 0.9563
- Modalidade Licitacao Number: 421
- Numero Exercicio Precision: 0.9375
- Numero Exercicio Recall: 0.9730
- Numero Exercicio F1: 0.9549
- Numero Exercicio Number: 185
- Objeto Licitacao Precision: 0.5309
- Objeto Licitacao Recall: 0.7288
- Objeto Licitacao F1: 0.6143
- Objeto Licitacao Number: 59
- Valor Objeto Precision: 0.8409
- Valor Objeto Recall: 0.9024
- Valor Objeto F1: 0.8706
- Valor Objeto Number: 41
- Overall Precision: 0.8719
- Overall Recall: 0.9283
- Overall F1: 0.8992
- Overall Accuracy: 0.9967

## Model description

More information needed

## Intended uses & limitations

More information needed
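
The card does not say how the model is meant to be consumed, but the per-entity metrics above imply a token-classification (NER) model over Brazilian procurement text. The sketch below is a minimal, unverified example of that usage; the local path `./bert_base_tcm_teste`, the sample sentence, and the entity group names are assumptions, not details taken from this card.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Hypothetical checkpoint location; substitute the actual Hub id or local path.
checkpoint = "./bert_base_tcm_teste"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# Group sub-word predictions into entity spans (e.g. modalidade da licitação,
# data da sessão, valor do objeto).
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Illustrative sentence only; not taken from the training or evaluation data.
text = (
    "Pregão Presencial nº 12/2020, critério de julgamento menor preço, "
    "sessão realizada em 15/06/2020, no valor de R$ 150.000,00."
)
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```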

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50.0
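
For readers who want to reproduce the run, these values map one-to-one onto Hugging Face `TrainingArguments`. The sketch below is a reconstruction under that assumption; the `output_dir` and everything not listed above (dataset, data collator, model wiring) are hypothetical and not taken from this card.

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="bert_base_tcm_teste",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50.0,
)
```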

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Criterio Julgamento Precision | Criterio Julgamento Recall | Criterio Julgamento F1 | Criterio Julgamento Number | Data Sessao Precision | Data Sessao Recall | Data Sessao F1 | Data Sessao Number | Modalidade Licitacao Precision | Modalidade Licitacao Recall | Modalidade Licitacao F1 | Modalidade Licitacao Number | Numero Exercicio Precision | Numero Exercicio Recall | Numero Exercicio F1 | Numero Exercicio Number | Objeto Licitacao Precision | Objeto Licitacao Recall | Objeto Licitacao F1 | Objeto Licitacao Number | Valor Objeto Precision | Valor Objeto Recall | Valor Objeto F1 | Valor Objeto Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:-----------------------------:|:--------------------------:|:----------------------:|:--------------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------------------:|:---------------------------:|:-----------------------:|:---------------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.0168        | 0.96  | 2750  | 0.0169          | 0.7016                        | 0.8365                     | 0.7632                 | 104                        | 0.6707                | 1.0                | 0.8029         | 55                 | 0.9424                         | 0.9715                      | 0.9567                  | 421                         | 0.9110                     | 0.9405                  | 0.9255              | 185                     | 0.3304                     | 0.6271                  | 0.4327              | 59                      | 0.76                   | 0.9268              | 0.8352          | 41                  | 0.8056            | 0.9249         | 0.8611     | 0.9950           |
| 0.0164        | 1.92  | 5500  | 0.0125          | 0.7565                        | 0.8365                     | 0.7945                 | 104                        | 0.6923                | 0.9818             | 0.8120         | 55                 | 0.9491                         | 0.9739                      | 0.9613                  | 421                         | 0.9375                     | 0.9730                  | 0.9549              | 185                     | 0.4138                     | 0.6102                  | 0.4932              | 59                      | 0.8085                 | 0.9268              | 0.8636          | 41                  | 0.8465            | 0.9306         | 0.8866     | 0.9965           |
| 0.0076        | 2.88  | 8250  | 0.0204          | 0.7184                        | 0.7115                     | 0.7150                 | 104                        | 0.8070                | 0.8364             | 0.8214         | 55                 | 0.9468                         | 0.9715                      | 0.9590                  | 421                         | 0.9282                     | 0.9784                  | 0.9526              | 185                     | 0.4783                     | 0.5593                  | 0.5156              | 59                      | 0.7209                 | 0.7561              | 0.7381          | 41                  | 0.8610            | 0.8948         | 0.8776     | 0.9961           |
| 0.0067        | 3.84  | 11000 | 0.0168          | 0.7589                        | 0.8173                     | 0.7870                 | 104                        | 0.8                   | 0.8                | 0.8000         | 55                 | 0.9487                         | 0.9667                      | 0.9576                  | 421                         | 0.9319                     | 0.9622                  | 0.9468              | 185                     | 0.5309                     | 0.7288                  | 0.6143              | 59                      | 0.8636                 | 0.9268              | 0.8941          | 41                  | 0.8717            | 0.9191         | 0.8948     | 0.9965           |
| 0.0043        | 4.8   | 13750 | 0.0144          | 0.736                         | 0.8846                     | 0.8035                 | 104                        | 0.8033                | 0.8909             | 0.8448         | 55                 | 0.9512                         | 0.9715                      | 0.9612                  | 421                         | 0.9316                     | 0.9568                  | 0.944               | 185                     | 0.5135                     | 0.6441                  | 0.5714              | 59                      | 0.8444                 | 0.9268              | 0.8837          | 41                  | 0.8681            | 0.9283         | 0.8972     | 0.9967           |
| 0.0072        | 5.76  | 16500 | 0.0161          | 0.8091                        | 0.8558                     | 0.8318                 | 104                        | 0.7237                | 1.0                | 0.8397         | 55                 | 0.9487                         | 0.9667                      | 0.9576                  | 421                         | 0.9326                     | 0.9730                  | 0.9524              | 185                     | 0.4318                     | 0.6441                  | 0.5170              | 59                      | 0.8222                 | 0.9024              | 0.8605          | 41                  | 0.8565            | 0.9318         | 0.8926     | 0.9966           |
| 0.003         | 6.72  | 19250 | 0.0205          | 0.7719                        | 0.8462                     | 0.8073                 | 104                        | 0.7812                | 0.9091             | 0.8403         | 55                 | 0.9507                         | 0.9620                      | 0.9563                  | 421                         | 0.9375                     | 0.9730                  | 0.9549              | 185                     | 0.5309                     | 0.7288                  | 0.6143              | 59                      | 0.8409                 | 0.9024              | 0.8706          | 41                  | 0.8719            | 0.9283         | 0.8992     | 0.9967           |
| 0.0033        | 7.68  | 22000 | 0.0197          | 0.7736                        | 0.7885                     | 0.7810                 | 104                        | 0.7463                | 0.9091             | 0.8197         | 55                 | 0.9466                         | 0.9691                      | 0.9577                  | 421                         | 0.9227                     | 0.9676                  | 0.9446              | 185                     | 0.5286                     | 0.6271                  | 0.5736              | 59                      | 0.7442                 | 0.7805              | 0.7619          | 41                  | 0.8650            | 0.9110         | 0.8874     | 0.9964           |
| 0.0043        | 8.64  | 24750 | 0.0250          | 0.7607                        | 0.8558                     | 0.8054                 | 104                        | 0.7612                | 0.9273             | 0.8361         | 55                 | 0.9400                         | 0.9667                      | 0.9532                  | 421                         | 0.9427                     | 0.9784                  | 0.9602              | 185                     | 0.5479                     | 0.6780                  | 0.6061              | 59                      | 0.8043                 | 0.9024              | 0.8506          | 41                  | 0.8675            | 0.9306         | 0.8979     | 0.9965           |
| 0.0014        | 9.61  | 27500 | 0.0257          | 0.8018                        | 0.8558                     | 0.8279                 | 104                        | 0.7391                | 0.9273             | 0.8226         | 55                 | 0.9417                         | 0.9596                      | 0.9506                  | 421                         | 0.9372                     | 0.9676                  | 0.9521              | 185                     | 0.5143                     | 0.6102                  | 0.5581              | 59                      | 0.8                    | 0.8780              | 0.8372          | 41                  | 0.8689            | 0.9191         | 0.8933     | 0.9966           |
| 0.0025        | 10.57 | 30250 | 0.0258          | 0.7798                        | 0.8173                     | 0.7981                 | 104                        | 0.7424                | 0.8909             | 0.8099         | 55                 | 0.9465                         | 0.9667                      | 0.9565                  | 421                         | 0.9424                     | 0.9730                  | 0.9574              | 185                     | 0.5352                     | 0.6441                  | 0.5846              | 59                      | 0.8222                 | 0.9024              | 0.8605          | 41                  | 0.8728            | 0.9202         | 0.8959     | 0.9963           |
| 0.0016        | 11.53 | 33000 | 0.0273          | 0.7925                        | 0.8077                     | 0.8000                 | 104                        | 0.7246                | 0.9091             | 0.8065         | 55                 | 0.9485                         | 0.9620                      | 0.9552                  | 421                         | 0.9282                     | 0.9784                  | 0.9526              | 185                     | 0.56                       | 0.7119                  | 0.6269              | 59                      | 0.8409                 | 0.9024              | 0.8706          | 41                  | 0.8723            | 0.9237         | 0.8972     | 0.9964           |
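
The card does not name the metric implementation, but span-level precision, recall, and F1 per entity type, as reported above, are what `seqeval` produces in the standard Hugging Face token-classification recipe. The snippet below only illustrates how such numbers are computed from BIO-tagged sequences; the label names are guesses derived from the metric columns, and the toy sequences are not real data.

```python
from seqeval.metrics import classification_report

# Toy BIO-tagged sequences; a real evaluation would use the model's predictions
# and the gold labels of the evaluation split (not available from this card).
y_true = [["O", "B-MODALIDADE_LICITACAO", "I-MODALIDADE_LICITACAO", "O", "B-NUMERO_EXERCICIO"]]
y_pred = [["O", "B-MODALIDADE_LICITACAO", "I-MODALIDADE_LICITACAO", "O", "O"]]

# Prints precision, recall, and F1 per entity type, plus overall averages.
print(classification_report(y_true, y_pred))
```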


### Framework versions

- Transformers 4.21.0.dev0
- Pytorch 1.11.0+cu113
- Datasets 2.3.2
- Tokenizers 0.12.1