dapSciBERT

dapSciBERT is a BERT-like model trained with the domain-adaptive pretraining (DAPT) method of Gururangan et al., adapted to the patent domain. The allenai/scibert_scivocab_uncased checkpoint is used as the base model for training. The training corpus consists of 10,000,000 patent abstracts filed between 1998 and 2020 with the US and European patent offices as well as the World Intellectual Property Organization.
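
Since the model is SciBERT-based, it can be loaded with the standard `transformers` masked-language-modeling interface. The snippet below is a minimal sketch; the model ID `<org>/dapSciBERT` is a placeholder and should be replaced with the actual Hugging Face Hub repository name.

```python
from transformers import pipeline

# Placeholder model ID -- substitute the real Hub repository for dapSciBERT.
fill_mask = pipeline("fill-mask", model="<org>/dapSciBERT")

# Example patent-style sentence using BERT's [MASK] token.
predictions = fill_mask("The invention relates to a [MASK] for storing electrical energy.")
for p in predictions:
    print(p["token_str"], p["score"])
```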
