---
license: apache-2.0
size_categories:
- 100K<n<1M
---

# AntiBERTa Pretraining Data

## Description
Pretraining data for the [AntiBERTa](https://github.com/alchemab/antiberta) protein language model from [Alchemab Therapeutics](https://www.alchemab.com/).

## Citation

```
@article{Leem_Mitchell_Farmery_Barton_Galson_2022,
  title     = {Deciphering the language of antibodies using self-supervised learning},
  author    = {Leem, Jinwoo and Mitchell, Laura S. and Farmery, James H. R. and Barton, Justin and Galson, Jacob D.},
  journal   = {Patterns},
  volume    = {3},
  number    = {7},
  year      = {2022},
  month     = {Jul},
  publisher = {Elsevier},
  issn      = {2666-3899},
  doi       = {10.1016/j.patter.2022.100513},
  url       = {https://www.cell.com/patterns/abstract/S2666-3899(22)00105-2},
  language  = {English}
}
```