Instructions for using ExponentialScience/LedgerBERT with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ExponentialScience/LedgerBERT with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="ExponentialScience/LedgerBERT")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("ExponentialScience/LedgerBERT")
model = AutoModelForMaskedLM.from_pretrained("ExponentialScience/LedgerBERT")
```

- Notebooks
- Google Colab
- Kaggle
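As a minimal sketch of putting the feature-extraction pipeline to work: it returns one vector per token, so a sentence embedding is commonly built by mean-pooling those vectors. The `mean_pool` helper and the example sentence below are illustrative assumptions, not part of the model card, and the snippet assumes network access to download the model weights.

```python
from transformers import pipeline

def mean_pool(token_vectors):
    """Average per-token vectors into one sentence embedding.
    (Mean pooling is a common convention, not mandated by the model.)"""
    n = len(token_vectors)
    return [sum(dims) / n for dims in zip(*token_vectors)]

# Assumes the model weights can be downloaded from the Hub.
extract = pipeline("feature-extraction", model="ExponentialScience/LedgerBERT")
tokens = extract("ledger transaction")[0]  # list of per-token vectors
embedding = mean_pool(tokens)
print(len(embedding))  # the model's hidden size
```

Because the model exposes a masked-LM head (`AutoModelForMaskedLM`), the `"fill-mask"` pipeline task is another natural way to probe it.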