---
license: mit
---
An Arabic ModernBERT model, partially trained (13% of one epoch) on a pretokenized, filtered subset of FineWeb2 (texts of 250-25,000 characters, with 98% or more Arabic words).
The filtered dataset itself (text column only) is available here; it contains a little over 30M records.
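The filtering criteria above (length bounds plus an Arabic-word ratio) could be sketched as follows. This is an illustrative approximation, not the author's actual filtering code; the `keep_record` function name and the regex-based definition of an "Arabic word" are assumptions, while the thresholds come from the description above.

```python
import re

# Hypothetical check: a word counts as Arabic if it consists entirely
# of characters in the main Arabic Unicode blocks. (Assumption — the
# card does not specify how Arabic words were detected.)
ARABIC_WORD = re.compile(r"[\u0600-\u06FF\u0750-\u077F]+")

def keep_record(text: str,
                min_len: int = 250,
                max_len: int = 25_000,
                min_arabic_ratio: float = 0.98) -> bool:
    """Return True if `text` passes the length and Arabic-ratio filters."""
    if not (min_len <= len(text) <= max_len):
        return False
    words = text.split()
    if not words:
        return False
    arabic = sum(1 for w in words if ARABIC_WORD.fullmatch(w))
    return arabic / len(words) >= min_arabic_ratio
```

A record passes only if it falls inside the 250-25,000 character window and at least 98% of its whitespace-separated tokens match the Arabic pattern.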
The model folder contains a checkpoint trained for 60,000 iterations with a batch size of 64 on a single GPU.