Arabic ModernBERT model partially trained (13% of one epoch) on a filtered, pretokenized subset of FineWeb2 (text length 250-25,000 characters; at least 98% Arabic words).
The actual filtered dataset (text column only) is here. It contains a little over 30M records.
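As a rough illustration of the filtering criteria, the sketch below applies the length and Arabic-word-ratio thresholds to FineWeb2. It is a minimal sketch, not the script used to build the dataset: the exact word-counting rule, the Unicode range used to detect Arabic, and the FineWeb2 config name (`arb_Arab`) are assumptions.

```python
import re
from datasets import load_dataset

# Basic Arabic Unicode block; a simplification, the original filter may differ.
ARABIC_WORD = re.compile(r"^[\u0600-\u06FF]+$")

def arabic_word_ratio(text: str) -> float:
    """Fraction of whitespace-separated tokens made up of Arabic characters only."""
    words = text.split()
    if not words:
        return 0.0
    return sum(1 for w in words if ARABIC_WORD.match(w)) / len(words)

def keep(example) -> bool:
    # 250-25,000 characters and at least 98% Arabic words, per the model card.
    text = example["text"]
    return 250 <= len(text) <= 25_000 and arabic_word_ratio(text) >= 0.98

# Streaming avoids downloading the full corpus; "arb_Arab" is assumed to be
# the FineWeb2 config name for Standard Arabic.
ds = load_dataset("HuggingFaceFW/fineweb-2", name="arb_Arab", split="train", streaming=True)
filtered = ds.filter(keep)
```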
The model folder contains a checkpoint trained for 60,000 iterations with a batch size of 64 on a single GPU.
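If the checkpoint is saved in Hugging Face `transformers` format (an assumption; the repo path below is a placeholder), it can be loaded for masked-language-model inference roughly as follows. Given that training covered only 13% of one epoch, predictions should be treated as exploratory.

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Placeholder repo id; replace with this model's actual path.
repo_id = "your-username/arabic-modernbert"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# "The Egyptian capital is [MASK]." -- assumes the tokenizer uses [MASK] as its mask token.
text = "العاصمة المصرية هي [MASK]."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Decode the top prediction at the masked position.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = outputs.logits[0, mask_index].argmax(-1)
print(tokenizer.decode(predicted_id))
```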