eduardosoares99
committed on
Update README.md
README.md CHANGED
```diff
@@ -20,9 +20,7 @@ library_name: transformers
 
 This repository provides PyTorch source code associated with our publication, "A Mamba-Based Foundation Model for Chemistry".
 
-**Paper:** [Arxiv Link](https://openreview.net/pdf?id=HTgCs0KSTl)
-
-**HuggingFace:** [HuggingFace Link](https://huggingface.co/ibm/materials.smi_ssed)
+**Paper NeurIPS AI4Mat 2024:** [Arxiv Link](https://openreview.net/pdf?id=HTgCs0KSTl)
 
 For more information contact: [email protected] or [email protected].
 
@@ -54,7 +52,7 @@ For more information contact: [email protected] or [email protected].
 
 ### Pretrained Models and Training Logs
 
-We provide checkpoints of the SMI-SSED model pre-trained on a dataset of ~91M molecules curated from PubChem. The pre-trained model shows competitive performance on classification and regression benchmarks from MoleculeNet.
+We provide checkpoints of the SMI-SSED model pre-trained on a dataset of ~91M molecules curated from PubChem. The pre-trained model shows competitive performance on classification and regression benchmarks from MoleculeNet.
 
 Add the SMI-SSED `pre-trained weights.pt` to the `inference/` or `finetune/` directory according to your needs. The directory structure should look like the following:
 
```
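The README step above places a PyTorch checkpoint (`.pt` file) where the repository's inference or finetuning scripts expect it. As a minimal, hedged sketch of inspecting such a checkpoint once it is in place — the filename `smi_ssed_weights.pt` below is a placeholder, not a name confirmed by this diff, and the repository's own loading utilities should be preferred in practice:

```python
# Minimal sketch (assumption): the checkpoint filename and directory below are
# placeholders; use the actual file name documented in the repository.
import torch

CHECKPOINT_PATH = "inference/smi_ssed_weights.pt"  # hypothetical path

# Load on CPU first; tensors can be moved to GPU afterwards if one is available.
checkpoint = torch.load(CHECKPOINT_PATH, map_location="cpu")

# A .pt checkpoint usually holds either a raw state_dict or a dict wrapping one
# (e.g. under a "state_dict" or "model" key); inspect the top-level keys to see
# which case applies before passing it to the model's load_state_dict().
if isinstance(checkpoint, dict):
    print(list(checkpoint.keys())[:10])
```

This snippet only illustrates that the provided weights are an ordinary PyTorch checkpoint; the `inference/` and `finetune/` scripts referenced in the README are expected to handle the actual loading.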