---
datasets:
- jigsaw_unintended_bias
language:
- en
pipeline_tag: fill-mask
tags:
- code
- bert
- non-toxic
- nontoxicbert
---

# Description

CivilBert is bert-base-uncased further fine-tuned on only the non-toxic portion of the Jigsaw Unintended Bias dataset, so that its predicted tokens are less toxic.

We are continuing to work on making this model better and less toxic.

# Usage

You can use it directly with the `transformers` library:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Ashokajou51/NonToxicCivilBert")
model = AutoModelForMaskedLM.from_pretrained("Ashokajou51/NonToxicCivilBert")
```
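As a sketch of how the model can be queried once loaded, the `fill-mask` pipeline predicts completions for a masked token (the example sentence is illustrative, not from the model card, and the predictions depend on the model weights):

```python
from transformers import pipeline

# Build a fill-mask pipeline backed by this model
fill_mask = pipeline("fill-mask", model="Ashokajou51/NonToxicCivilBert")

# [MASK] is bert-base-uncased's mask token; the pipeline returns the
# top candidate tokens with their scores
for prediction in fill_mask("The new neighbors seem [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```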