# Model Card for Clinical Mosaic
Clinical Mosaic is a transformer-based language model for clinical text, built on the Mosaic BERT architecture. It is pretrained on 331,794 deidentified clinical notes from the MIMIC-IV-NOTES 2.2 database with a sequence length of 512 tokens, and uses Attention with Linear Biases (ALiBi) to extrapolate beyond this limit without requiring learned positional embeddings.
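For readers unfamiliar with ALiBi: instead of learned positional embeddings, each attention head adds a distance-proportional penalty to its attention logits. A minimal NumPy sketch of the bias computation (symmetric, BERT-style; the head slopes follow the geometric sequence from the ALiBi paper — function names here are illustrative, not the model's actual API):

```python
import numpy as np

def alibi_slopes(n_heads: int) -> np.ndarray:
    """Per-head slopes m_i = 2^(-8*i/n) for i = 1..n (assumes n is a power of 2)."""
    return np.array([2.0 ** (-8.0 * (i + 1) / n_heads) for i in range(n_heads)])

def alibi_bias(n_heads: int, seq_len: int) -> np.ndarray:
    """Symmetric (bidirectional) ALiBi bias -m * |i - j|, added to attention
    logits before softmax. Returned shape: (n_heads, seq_len, seq_len)."""
    pos = np.arange(seq_len)
    dist = np.abs(pos[None, :] - pos[:, None])   # pairwise token distances
    slopes = alibi_slopes(n_heads)               # one slope per head
    return -slopes[:, None, None] * dist[None, :, :]

bias = alibi_bias(n_heads=8, seq_len=4)
print(bias.shape)  # (8, 4, 4)
```

Because the bias depends only on relative distance, the same formula applies unchanged at inference-time sequence lengths longer than the 512 tokens seen in pretraining, which is what enables the extrapolation described above.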
## Model Details
- **Developed by:** Sifal Klioui, Sana Sellami, and Youssef Trardi (Aix-Marseille Univ, LIS, CNRS, Marseille, France)
- **Funded by:** the PICOMALE project (AMIDEX), under the direction of the CEDRE