---
license: apache-2.0
language:
- en
pipeline_tag: question-answering
---
[bioformer-8L](https://huggingface.co/bioformers/bioformer-8L) fine-tuned on the [SQuAD1](https://rajpurkar.github.io/SQuAD-explorer) dataset for 3 epochs.
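
A minimal usage sketch with the `transformers` question-answering pipeline. The model id below is an assumption for illustration; replace it with this repository's actual id on the Hub.

```python
from transformers import pipeline

# Hypothetical model id used for illustration; substitute this repo's id.
qa = pipeline("question-answering", model="bioformers/bioformer-8L-squad1")

result = qa(
    question="What does BRCA1 encode?",
    context="BRCA1 is a human gene that encodes the breast cancer type 1 "
            "susceptibility protein, which is involved in DNA repair.",
)
print(result["answer"])
```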

The fine-tuning process was performed on a single P100 GPU (16 GB). The hyperparameters were:

```
max_seq_length=512
per_device_train_batch_size=16
gradient_accumulation_steps=1
total train batch size (w. parallel, distributed & accumulation) = 16
learning_rate=3e-5
num_train_epochs=3
```
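
For reference, a sketch of how these settings map onto `transformers` `TrainingArguments`; the output directory is an assumed placeholder, and dataset preprocessing and the `Trainer` call are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bioformer-8L-squad1",  # assumed output path
    per_device_train_batch_size=16,
    gradient_accumulation_steps=1,     # total train batch size = 16 on one GPU
    learning_rate=3e-5,
    num_train_epochs=3,
)
# max_seq_length=512 is applied at tokenization time, e.g.
# tokenizer(question, context, max_length=512, truncation="only_second")
```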

## Evaluation results

```
"eval_exact_match": 78.55250709555345
"eval_f1": 85.91482799690257
```
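
These numbers follow the standard SQuAD exact-match/F1 definitions. A sketch of how such scores can be computed with the `evaluate` library (the prediction/reference pair below is a toy example, not from this run):

```python
import evaluate

squad_metric = evaluate.load("squad")

# Toy example: one prediction matched against its gold answers.
predictions = [{"id": "q1", "prediction_text": "Paris"}]
references = [{"id": "q1", "answers": {"text": ["Paris"], "answer_start": [0]}}]

print(squad_metric.compute(predictions=predictions, references=references))
# -> {'exact_match': 100.0, 'f1': 100.0}
```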

Bioformer's performance is on par with [DistilBERT](https://arxiv.org/pdf/1910.01108.pdf) (EM/F1: 77.7/85.8), 
although Bioformer was pretrained only on biomedical texts. 


## Speed
In our experiments, Bioformer runs inference 3x as fast as BERT-base/BioBERT/PubMedBERT, and 40% faster than DistilBERT.