lhallee committed on
Commit 2ef2239 · 1 Parent(s): 0e1538c

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED

@@ -21,7 +21,7 @@ widget:
 
 ## Model description
 
-[cdsBERT](https://doi.org/10.1101/2023.09.15.558027) is a pLM with a codon vocabulary that was seeded with [ProtBERT](https://huggingface.co/Rostlab/prot_bert_bfd) and trained with a novel vocabulary extension pipeline called MELD. cdsBERT offers a highly biologically relevant latent space with excellent EC number prediction, surpassing ProtBERT.
+[cdsBERT+](https://doi.org/10.1101/2023.09.15.558027) is a pLM with a codon vocabulary that was seeded with [ProtBERT](https://huggingface.co/Rostlab/prot_bert_bfd) and trained with a novel vocabulary extension pipeline called MELD. cdsBERT+ offers a highly biologically relevant latent space with excellent EC number prediction, surpassing ProtBERT.
 
 ## How to use
 
@@ -30,7 +30,7 @@ widget:
 import re
 import torch
 import torch.nn.functional as F
-from transformers import BertForMaskedLM, BertTokenizer
+from transformers import BertModel, BertTokenizer
 
 model = BertModel.from_pretrained('lhallee/cdsBERT') # load model
 tokenizer = BertTokenizer.from_pretrained('lhallee/cdsBERT') # load tokenizer
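The snippet in the diff ends right after loading the model and tokenizer. A minimal sketch of how the rest of an embedding workflow might look: a masked mean-pooling helper over `last_hidden_state`, plus hypothetical usage. The codon string and its whitespace-separated token format are assumptions for illustration (ProtBERT-style tokenizers expect space-separated tokens), not something the diff confirms.

```python
import torch


def mean_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over real (non-padding) positions."""
    mask = attention_mask.unsqueeze(-1).float()   # (batch, seq, 1)
    summed = (hidden_states * mask).sum(dim=1)    # zero out padding, then sum over seq
    counts = mask.sum(dim=1).clamp(min=1e-9)      # number of real tokens per sequence
    return summed / counts                        # (batch, hidden)


# Hypothetical usage with the model/tokenizer loaded above (requires a download;
# the codon sequence and spacing are illustrative assumptions):
# seq = 'ATG GCT TCA'
# batch = tokenizer(seq, return_tensors='pt')
# with torch.no_grad():
#     out = model(**batch)
# embedding = mean_pool(out.last_hidden_state, batch['attention_mask'])
```

Pooling with the attention mask (rather than a plain `.mean(dim=1)`) keeps padding tokens from diluting the sequence embedding when batching variable-length inputs.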