# My Toxicity Debiaser Pipeline

This custom pipeline debiases toxic text in two stages: a toxicity classifier flags toxic input, and GPT-2 generates a reworded version.
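
The sketch below illustrates that two-stage flow. It is a minimal sketch only: the `debias` helper, the prompt wording, and the toxic-label index are illustrative assumptions, not the pipeline's actual source.

```python
import torch

def debias(text, toxicity_model, toxicity_tokenizer, gpt_model, gpt_tokenizer):
    # Stage 1: score the input with the toxicity classifier.
    inputs = toxicity_tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = toxicity_model(**inputs).logits
    # Assumption: class index 1 means "toxic"; check the model's config.id2label.
    if logits.argmax(dim=-1).item() != 1:
        return text  # non-toxic input passes through unchanged

    # Stage 2: prompt GPT-2 to reword the flagged text (prompt wording is illustrative).
    prompt = f"Rewrite the following text politely: {text}\n"
    prompt_ids = gpt_tokenizer(prompt, return_tensors="pt").input_ids
    output_ids = gpt_model.generate(
        prompt_ids,
        max_new_tokens=50,
        do_sample=True,
        top_p=0.9,
        pad_token_id=gpt_tokenizer.eos_token_id,  # GPT-2 has no pad token
    )
    # Keep only the newly generated continuation, not the prompt.
    return gpt_tokenizer.decode(output_ids[0, prompt_ids.shape[1]:], skip_special_tokens=True)
```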

## Usage

To use the pipeline, import the `MyToxicityDebiaserPipeline` class, load the required models and tokenizers, and instantiate the pipeline:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, GPT2LMHeadModel, GPT2Tokenizer
from my_toxicity_debiaser import MyToxicityDebiaserPipeline

toxicity_model_name = "shainaraza/toxity_classify_debiaser"
gpt_model_name = "gpt2"

# Load the toxicity classifier and its tokenizer
toxicity_tokenizer = AutoTokenizer.from_pretrained(toxicity_model_name)
toxicity_model = AutoModelForSequenceClassification.from_pretrained(toxicity_model_name)

# Load GPT-2 and its tokenizer for generating the debiased rewrite
gpt_tokenizer = GPT2Tokenizer.from_pretrained(gpt_model_name)
gpt_model = GPT2LMHeadModel.from_pretrained(gpt_model_name)

# Build the pipeline from the classifier and the generator
pipeline = MyToxicityDebiaserPipeline(
    model=toxicity_model,
    tokenizer=toxicity_tokenizer,
    gpt_model=gpt_model,
    gpt_tokenizer=gpt_tokenizer,
)

text = "Your example text here"
result = pipeline(text)
print(result)
```
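
Independently of the debiasing step, you can query the classifier on its own to inspect its toxicity scores. This is plain `transformers` sequence-classification usage and assumes only the objects loaded above:

```python
import torch

inputs = toxicity_tokenizer("Your example text here", return_tensors="pt", truncation=True)
with torch.no_grad():
    probs = toxicity_model(**inputs).logits.softmax(dim=-1)

# id2label maps each class index to its human-readable label
print(toxicity_model.config.id2label)
print(probs)
```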