timotheeplanes and louisbrulenaudet committed
Commit 7c65df8
1 parent: 2b58000

Update README.md (#2)


- Update README.md (976d730da7ecfab1af0f99a1c4af5e95ab620b61)


Co-authored-by: Louis Brulé Naudet <[email protected]>

Files changed (1):
  1. README.md +15 -13
README.md CHANGED
@@ -18,7 +18,10 @@ library_name: sentence-transformers
 
 This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
 
-<!--- Describe your model here -->
+Transformer model pretrained on the largest Wikipedia corpus using a masked language modeling (MLM) objective, then fitted with a Transformer-based Sequential Denoising Auto-Encoder (TSDAE) for unsupervised sentence embedding learning, with one objective: anti-doping domain adaptation.
+
+This way, the model learns an inner representation of the anti-doping language in the training set that can then be used to extract features for downstream tasks: if you have a dataset of labeled sentences, for instance, you can train a standard classifier using the features produced by the model as inputs.
+
 
 ## Usage (Sentence-Transformers)
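The added description says the embeddings can serve as fixed features for a downstream classifier. A minimal sketch of that workflow, assuming a small hypothetical labeled dataset and a scikit-learn classifier (neither appears in this commit):

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

model = SentenceTransformer("timotheeplanes/anti-doping-bert-base")

# Hypothetical labeled sentences (1 = doping-related, 0 = not).
train_sentences = [
    "The athlete tested positive for a prohibited substance.",
    "The match was postponed because of heavy rain.",
]
train_labels = [1, 0]

# Encode once and treat the 768-dimensional embeddings as fixed features.
features = model.encode(train_sentences)  # shape: (n_sentences, 768)

clf = LogisticRegression()
clf.fit(features, train_labels)
```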
@@ -34,13 +37,12 @@ Then you can use the model like this:
 from sentence_transformers import SentenceTransformer
 sentences = ["This is an example sentence", "Each sentence is converted"]
 
-model = SentenceTransformer('{MODEL_NAME}')
+model = SentenceTransformer("timotheeplanes/anti-doping-bert-base")
 embeddings = model.encode(sentences)
 print(embeddings)
 ```
 
 
-
 ## Usage (HuggingFace Transformers)
 Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
 
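The pooling code this paragraph refers to sits in unchanged lines the diff does not show. For reference, here is a sketch of the standard mean-pooling recipe from sentence-transformers model cards; that this card uses mean pooling is an assumption based on the template, not something the diff confirms:

```python
import torch
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    # Average the token embeddings, ignoring padding positions.
    token_embeddings = model_output[0]  # first element: token embeddings
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

sentences = ["This is an example sentence", "Each sentence is converted"]

tokenizer = AutoTokenizer.from_pretrained("timotheeplanes/anti-doping-bert-base")
model = AutoModel.from_pretrained("timotheeplanes/anti-doping-bert-base")

encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    model_output = model(**encoded_input)

sentence_embeddings = mean_pooling(model_output, encoded_input["attention_mask"])
print(sentence_embeddings)
```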
@@ -75,14 +77,6 @@ print(sentence_embeddings)
 ```
 
 
-
-## Evaluation Results
-
-<!--- Describe how your model was evaluated -->
-
-For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
-
-
 ## Training
 The model was trained with the parameters:
 
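The model description above names TSDAE as the adaptation method, but the training code itself is not part of this diff. A hedged sketch of a typical TSDAE setup in sentence-transformers; the corpus, the base checkpoint, and the CLS pooling choice are illustrative assumptions, not details taken from this commit:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, datasets, losses

# Hypothetical unlabeled anti-doping corpus.
train_sentences = [
    "First sentence from the anti-doping corpus.",
    "Second sentence from the anti-doping corpus.",
]

# Assumed base checkpoint; the commit does not state which one was used.
word_embedding_model = models.Transformer("bert-base-uncased")
pooling_model = models.Pooling(
    word_embedding_model.get_word_embedding_dimension(), pooling_mode="cls"
)
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# TSDAE: the dataset pairs each sentence with a noisy (token-deleted) copy...
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)

# ...and the loss reconstructs the original sentence through a tied decoder.
train_loss = losses.DenoisingAutoEncoderLoss(model, tie_encoder_decoder=True)
```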
@@ -102,7 +96,6 @@ Parameters of the fit()-Method:
 {
     "epochs": 1,
     "evaluation_steps": 0,
-    "evaluator": "NoneType",
     "max_grad_norm": 1,
     "optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
     "optimizer_params": {
@@ -126,4 +119,13 @@ SentenceTransformer(
 
 ## Citing & Authors
 
-<!--- Describe where people can find more information -->
+If you use this model in your research, please use the following BibTeX entry.
+
+```BibTeX
+@misc{louisbrulenaudet2023,
+  author = {Brulé Naudet (L.) and Planes (T.)},
+  title = {Domain-adapted BERT for anti-doping practice},
+  year = {2023},
+  howpublished = {\url{https://huggingface.co/timotheeplanes/anti-doping-bert-base}},
+}
+```