AraModernBert for Topic Classification

Overview

This experimental Arabic model demonstrates how ModernBERT can be adapted to Arabic for tasks such as topic classification.

It is an experimental Arabic version of ModernBERT-base, trained only on the topic classification task, combining the original ModernBERT base model with a custom tokenizer trained on Arabic data with the following details:

  • Dataset: Arabic Wikipedia
  • Size: 1.8 GB
  • Tokens: 228,788,529

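Since the classifier relies on a tokenizer trained specifically on Arabic text, the tokenizer can be loaded and inspected on its own. A minimal sketch, using the repository ID from the usage section below:

from transformers import AutoTokenizer

# Load the custom Arabic tokenizer bundled with the model repository
tokenizer = AutoTokenizer.from_pretrained(
    "Omartificial-Intelligence-Space/AraModernBert-Topic-Classifier"
)

# Inspect the subword segmentation of a short Arabic sentence
# ("the team won the final match")
print(tokenizer.tokenize("فاز الفريق بالمباراة النهائية"))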

Model Evaluation Details

  • Epochs: 3
  • Evaluation Metrics:
    • F1 Score: 0.95
    • Loss: 0.1998
  • Training Steps: 47,862
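
For reference, an F1 score like the one above can be computed from model predictions with scikit-learn. A minimal sketch with made-up labels; the averaging mode is an assumption, since the card does not state which one was used:

from sklearn.metrics import f1_score

# Hypothetical gold labels and predictions, for illustration only
y_true = ["Sport", "Tech", "Politics", "Sport"]
y_pred = ["Sport", "Tech", "Finance", "Sport"]

# "weighted" averaging is an assumption; the card does not specify it
print(f1_score(y_true, y_pred, average="weighted"))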

Dataset Used for Training

  • The SANAD dataset was used for training and testing. It contains 7 topics: Politics, Finance, Medical, Culture, Sport, Tech, and Religion.
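
For fine-tuning on these 7 topics, the labels would typically be mapped to integer class ids and attached to a classification head. A minimal sketch, assuming the public ModernBERT-base checkpoint and this label ordering (both are illustrative assumptions, not necessarily the card's exact setup):

from transformers import AutoModelForSequenceClassification

# The 7 SANAD topics; the exact label strings and ordering used in
# training are assumptions for illustration
topics = ["Politics", "Finance", "Medical", "Culture", "Sport", "Tech", "Religion"]
id2label = dict(enumerate(topics))
label2id = {label: i for i, label in id2label.items()}

# Attach a 7-way classification head to the base encoder
model = AutoModelForSequenceClassification.from_pretrained(
    "answerdotai/ModernBERT-base",  # the original ModernBERT base model
    num_labels=len(topics),
    id2label=id2label,
    label2id=label2id,
)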

How to Use

The model can be used for text classification with the transformers library. Below is an example:

from transformers import pipeline

# Load model from huggingface.co/models using our repository ID
classifier = pipeline(
    task="text-classification",
    model="Omartificial-Intelligence-Space/AraModernBert-Topic-Classifier",
)

sample = '''
PUT SOME TEXT HERE TO CLASSIFY ITS TOPIC
'''
classifier(sample)

# [{'label': 'health', 'score': 0.6779336333274841}]
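
To inspect the scores the model assigns to every topic rather than only the top one, the pipeline's top_k argument can be set to None (a standard transformers option, shown here as a usage sketch):

# Return a score for each topic instead of only the highest-scoring one
classifier(sample, top_k=None)
# -> a list of {'label': ..., 'score': ...} dicts, one per topic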

Test Phase Results

  • The model was evaluated on a test set of 14,181 examples covering the different topics; their distribution is shown below:

[Figure: distribution of topics in the test set]

  • The model achieved the following accuracy on this test set:

[Figure: prediction accuracy on the test set]

Citation

@misc{modernbert,
      title={Smarter, Better, Faster, Longer: A Modern Bidirectional Encoder for Fast, Memory Efficient, and Long Context Finetuning and Inference}, 
      author={Benjamin Warner and Antoine Chaffin and Benjamin Clavié and Orion Weller and Oskar Hallström and Said Taghadouini and Alexis Gallagher and Raja Biswas and Faisal Ladhak and Tom Aarsen and Nathan Cooper and Griffin Adams and Jeremy Howard and Iacopo Poli},
      year={2024},
      eprint={2412.13663},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2412.13663}, 
}