---
license: mit
datasets:
- dair-ai/emotion
language:
- en
library_name: transformers
widget:
- text: I am so happy with the results!
- text: I am so pissed with the results!
tags:
- deberta
- deberta-xlarge
- emotions-classifier
---
# Emotion-X: Fine-Tuned DeBERTa-XLarge Emotion Detection
This is a fine-tuned version of [microsoft/deberta-xlarge-mnli](https://huggingface.co/microsoft/deberta-xlarge-mnli) for emotion detection on the [dair-ai/emotion](https://huggingface.co/dair-ai/emotion) dataset.
## Overview
Emotion-X is an emotion detection model fine-tuned from Microsoft's DeBERTa-XLarge MNLI checkpoint. It classifies English text into one of six emotional categories, combining DeBERTa's strong language understanding with fine-tuning on a dedicated emotion dataset for high accuracy and reliability.
## Model Details
- **Model Name:** `AnkitAI/deberta-xlarge-base-emotions-classifier`
- **Base Model:** `microsoft/deberta-xlarge-mnli`
- **Dataset:** [dair-ai/emotion](https://huggingface.co/dair-ai/emotion)
- **Fine-tuning:** Fine-tuned for emotion detection with a classification head over six emotional categories (anger, disgust, fear, joy, sadness, surprise).
## Training
The model was trained using the following parameters:
- **Learning Rate:** 2e-5
- **Batch Size:** 4
- **Weight Decay:** 0.01
- **Evaluation Strategy:** epoch
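The exact training script is not part of this card. The sketch below shows how a fine-tuning run with the hyperparameters above (plus the 3 epochs listed in the model card data table) could be set up with the Hugging Face `Trainer`; details such as the tokenization length and output directory are assumptions, not the script actually used.

```python
# Hypothetical fine-tuning setup matching the hyperparameters listed above.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "microsoft/deberta-xlarge-mnli"
dataset = load_dataset("dair-ai/emotion")  # columns: "text", "label"

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=6,                  # six emotion classes
    ignore_mismatched_sizes=True,  # replace the 3-way MNLI head with a fresh 6-way head
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="emotion-x",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    weight_decay=0.01,
    num_train_epochs=3,
    eval_strategy="epoch",  # "evaluation_strategy" in older transformers releases
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
)
trainer.train()
```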
### Training Details
- **Eval Loss:** 0.0858
- **Eval Runtime:** 110070.6349 seconds
- **Eval Samples/Second:** 78.495
- **Eval Steps/Second:** 2.453
- **Train Loss:** 0.1049
- **Eval Accuracy:** 94.6%
- **Eval Precision:** 94.8%
- **Eval Recall:** 94.5%
- **Eval F1 Score:** 94.7%
## Usage
You can use this model directly with the Hugging Face `transformers` library:
```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "AnkitAI/deberta-xlarge-base-emotions-classifier"
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Example usage
def predict_emotion(text):
    inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True, max_length=128)
    with torch.no_grad():
        outputs = model(**inputs)
    # Pick the highest-scoring class and map its index to the label name from the model config
    predicted_id = outputs.logits.argmax(dim=-1).item()
    return model.config.id2label[predicted_id]

text = "I'm so happy with the results!"
emotion = predict_emotion(text)
print("Detected Emotion:", emotion)
```
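For quick experimentation, the same checkpoint should also work with the `transformers` `pipeline` API; a minimal sketch, assuming the fine-tuned config carries the label names:

```python
from transformers import pipeline

# Text-classification pipeline built on the same fine-tuned checkpoint
classifier = pipeline("text-classification", model="AnkitAI/deberta-xlarge-base-emotions-classifier")
print(classifier("I am so pissed with the results!"))
# e.g. [{'label': '...', 'score': 0.98}] -- the label string depends on the model config
```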
## Emotion Labels
- Anger
- Disgust
- Fear
- Joy
- Sadness
- Surprise
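The mapping between class indices and these label names lives in the model configuration; if in doubt, it can be inspected directly (a short sketch, assuming `id2label` was populated during fine-tuning):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("AnkitAI/deberta-xlarge-base-emotions-classifier")
print(config.id2label)  # maps class index -> emotion label
```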
## Model Card Data
| Parameter                     | Value                         |
|-------------------------------|-------------------------------|
| Base Model                    | microsoft/deberta-xlarge-mnli |
| Training Dataset              | dair-ai/emotion               |
| Number of Training Epochs     | 3                             |
| Learning Rate                 | 2e-5                          |
| Per Device Train Batch Size   | 4                             |
| Evaluation Strategy           | epoch                         |
| Best Model Accuracy           | 94.6%                         |
## License
This model is licensed under the [MIT License](LICENSE).