---
title: Submission Template
emoji: 🔥
colorFrom: yellow
colorTo: green
sdk: docker
pinned: false
---

## Model Description

This space is related to the text task of the Frugal AI Challenge. The final model is a ModernBERT trained on a mix of approximately 95,000 samples, consisting of both real and synthetic data. The dataset was open-sourced at MatthiasPicard/Frugal-AI-Train-Data-88k. The fine-tuned model, along with training logs, was open-sourced at MatthiasPicard/ModernBERT_frugal_88k.

To optimize inference time, we implemented dynamic padding with a batch size of 16. The model was converted to FP16 for reduced memory usage and faster inference. We experimented with different inference scripts, using both Hugging Face's native code and pure PyTorch implementations; the latter was found to be the most efficient. We also explored pruning and TensorRT, but these solutions worsened the trade-off between frugality and performance, so we did not adopt them.

### Note

The inference script includes both model and tokenizer loading. As a result, the first evaluation of our model in the submission space will consume more energy than subsequent evaluations.

### Labels

0. No relevant claim detected
1. Global warming is not happening
2. Not caused by humans
3. Not bad or beneficial
4. Solutions harmful/unnecessary
5. Science is unreliable
6. Proponents are biased
7. Fossil fuels are needed
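The dynamic padding mentioned above can be sketched as follows: instead of padding every sequence to a fixed maximum length, each batch of 16 is padded only to the length of its longest member, which cuts wasted computation on pad tokens. This is a minimal illustrative sketch, not our actual inference script; the function name and pad id are assumptions, and in the real pipeline the padded batches would be converted to `torch.LongTensor` before being fed to the model.

```python
def dynamic_pad_batches(sequences, batch_size=16, pad_id=0):
    """Group tokenized sequences into batches of `batch_size`,
    padding each batch only to its own longest sequence
    (dynamic padding). `sequences` is a list of token-id lists.

    Returns a list of (input_ids, attention_mask) pairs as nested
    lists; wrap them in torch.LongTensor in an actual pipeline.
    NOTE: function name and pad_id=0 are illustrative assumptions.
    """
    batches = []
    for start in range(0, len(sequences), batch_size):
        chunk = sequences[start:start + batch_size]
        max_len = max(len(seq) for seq in chunk)
        input_ids = [seq + [pad_id] * (max_len - len(seq)) for seq in chunk]
        attention_mask = [
            [1] * len(seq) + [0] * (max_len - len(seq)) for seq in chunk
        ]
        batches.append((input_ids, attention_mask))
    return batches
```

Because each batch is padded independently, a batch of short samples never pays the cost of the longest sample in the whole dataset, which is where the energy savings come from.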
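For reference, the label indices above can be turned into a simple lookup for decoding the classifier's output. This is an illustrative mapping mirroring the list in this README; the variable and function names are assumptions, not part of the released model code.

```python
# Illustrative id -> label mapping, mirroring the Labels list above.
ID2LABEL = {
    0: "No relevant claim detected",
    1: "Global warming is not happening",
    2: "Not caused by humans",
    3: "Not bad or beneficial",
    4: "Solutions harmful/unnecessary",
    5: "Science is unreliable",
    6: "Proponents are biased",
    7: "Fossil fuels are needed",
}

def decode_prediction(class_id: int) -> str:
    """Map a predicted class index to its human-readable label."""
    return ID2LABEL[class_id]
```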