SciLinkBERT: A Language Model for Understanding Scientific Texts with Citation Information
Introduction
SciLinkBERT is a BERT-based pre-trained language model specifically designed to enhance the understanding of scientific texts by incorporating citation information. This model is particularly useful in scientific domains, where understanding complex language and extracting meaningful information from citations is crucial.
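Because SciLinkBERT is a BERT-style encoder, it can be used through the standard Hugging Face `transformers` API. The sketch below is illustrative only: the card does not list a checkpoint ID, so it builds a small randomly initialized `BertModel` (rather than the real weights) to show the encoding flow offline; swap in `AutoModel.from_pretrained(...)` with the released checkpoint for real use.

```python
import torch
from transformers import BertConfig, BertModel

# Hypothetical stand-in: replace this randomly initialized BERT with the
# released SciLinkBERT checkpoint, e.g. AutoModel.from_pretrained("<checkpoint-id>").
config = BertConfig(hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=128)
model = BertModel(config)
model.eval()

# Dummy token IDs standing in for a tokenized scientific sentence:
# [CLS] ... [SEP]
input_ids = torch.tensor([[101, 2054, 2003, 102]])
with torch.no_grad():
    outputs = model(input_ids=input_ids)

# The [CLS] vector is a common sentence-level representation for downstream tasks.
cls_embedding = outputs.last_hidden_state[:, 0, :]
print(cls_embedding.shape)
```

With the actual SciLinkBERT weights, the same `cls_embedding` would carry citation-aware scientific-text features instead of random values.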
References and Additional Resources
For more advanced fine-tuning and implementation details, you can refer to the following repositories:
- LinkBERT: Provides an example of how citation links and other scientific data can be incorporated into BERT models.
- SciDeBERTa-Fine-Tuning: This repository demonstrates fine-tuning approaches on scientific datasets, which can be adapted for SciLinkBERT.
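The fine-tuning approaches referenced above typically attach a task head to the encoder and train end to end. The following is a minimal sketch of one training step for sequence classification; the tiny randomly initialized config and the dummy inputs are placeholders so the example runs offline, not the actual SciLinkBERT setup.

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Hypothetical setup: in practice you would load the released SciLinkBERT
# weights; a small random BERT with a 3-class head is used here for illustration.
config = BertConfig(hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=128, num_labels=3)
model = BertForSequenceClassification(config)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One toy training step on dummy token IDs and labels.
input_ids = torch.tensor([[101, 2003, 102],
                          [101, 2054, 102]])
labels = torch.tensor([0, 2])

outputs = model(input_ids=input_ids, labels=labels)  # loss computed internally
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(outputs.loss))
```

A real run would iterate this step over a tokenized scientific dataset (e.g. via a `DataLoader`), as the referenced repositories demonstrate.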
How to Cite
If you use SciLinkBERT in your research, please cite the following paper:
```bibtex
@article{Yu2024SciLinkBERT,
  title={SciLinkBERT: A Language Model for Understanding Scientific Texts with Citation Information},
  author={Yu, Ju-Yeon and Yang, Donghun and Lee, Kyong-Ha},
  journal={IEEE Access},
  year={2024},
}
```
Contributing
Contributions to SciLinkBERT are welcome! If you find any issues or have suggestions for improvements, feel free to open an issue or submit a pull request.
License
This project is licensed under the MIT License.