Arabic Question Generation Model
This model is ready to use for the question-generation task: provide a context passage and an answer, and the model generates the corresponding question. It is a fine-tuned version of AraT5-Base.
Live Demo
Generate a question from a given context and answer: Arabic QG Model
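For a quick local demo similar in spirit to the hosted Space, one option is a small Gradio app wrapped around the `get_question` helper defined in the next section. This is an illustrative sketch only, not the source of the hosted demo, and it assumes `gradio` is installed:

```python
import gradio as gr

# Illustrative local demo (assumption: gradio is installed); reuses get_question
# from the "Model in Action" section below. Not the hosted Space's source code.
demo = gr.Interface(
    fn=get_question,
    inputs=[gr.Textbox(label="Context"), gr.Textbox(label="Answer")],
    outputs=gr.Textbox(label="Generated question"),
    title="Arabic Question Generation",
)
demo.launch()
```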
Model in Action 🚀
```python
# Requirements
!pip install transformers

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("Mihakram/AraT5-base-question-generation")
tokenizer = AutoTokenizer.from_pretrained("Mihakram/AraT5-base-question-generation")

def get_question(context, answer):
    # Build the input in the "context: ... answer: ..." format the model expects
    text = "context: " + context + " " + "answer: " + answer + " </s>"
    text_encoding = tokenizer.encode_plus(text, return_tensors="pt")

    model.eval()
    generated_ids = model.generate(
        input_ids=text_encoding["input_ids"],
        attention_mask=text_encoding["attention_mask"],
        max_length=64,
        num_beams=5,
        num_return_sequences=1,
    )
    # Decode and strip the "question:" prefix emitted by the model
    return tokenizer.decode(
        generated_ids[0],
        skip_special_tokens=True,
        clean_up_tokenization_spaces=True,
    ).replace("question: ", " ")

# Example: a passage about the Algerian revolution; the answer is "7 and a half years"
context = "الثورة الجزائرية أو ثورة المليون شهيد، اندلعت في 1 نوفمبر 1954 ضد المستعمر الفرنسي ودامت 7 سنوات ونصف. استشهد فيها أكثر من مليون ونصف مليون جزائري"
answer = " 7 سنوات ونصف"

get_question(context, answer)
# Output: question = "كم استمرت الثورة الجزائرية؟"  ("How long did the Algerian revolution last?")
```
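The helper above returns a single question. Since generation already uses beam search with `num_beams=5`, a small variation (not from the original card) can return several candidate questions by raising `num_return_sequences`, which must stay at or below `num_beams`:

```python
# Variation on the card's helper: return several candidate questions.
def get_questions(context, answer, n_candidates=3):
    text = "context: " + context + " " + "answer: " + answer + " </s>"
    enc = tokenizer.encode_plus(text, return_tensors="pt")
    generated_ids = model.generate(
        input_ids=enc["input_ids"],
        attention_mask=enc["attention_mask"],
        max_length=64,
        num_beams=5,
        num_return_sequences=n_candidates,  # must be <= num_beams
    )
    return [
        tokenizer.decode(g, skip_special_tokens=True, clean_up_tokenization_spaces=True)
        .replace("question: ", " ")
        for g in generated_ids
    ]

get_questions(context, answer)  # a list of differently phrased candidate questions
```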
Citation
If you would like to cite this model, you can use the following:
Contacts
Mihoubi Akram Fawzi: LinkedIn | GitHub | [email protected]
Ibrir Adel: LinkedIn | GitHub | [email protected]
Evaluation results
- BLEU-1 (self-reported): 37.620
- BLEU-2 (self-reported): 27.800
- BLEU-3 (self-reported): 20.890
- BLEU-4 (self-reported): 15.870
- METEOR (self-reported): 33.190
- ROUGE-L (self-reported): 43.370
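These scores are self-reported, and the card does not include the evaluation script. As a rough sketch of how such corpus-level scores could be reproduced with the Hugging Face `evaluate` library (the predictions, references, and metric settings below are assumptions, not the authors' setup):

```python
import evaluate

# Hypothetical predictions and references; in practice these would come from
# running get_question over the held-out test split used by the authors.
predictions = ["كم استمرت الثورة الجزائرية؟"]
references = ["كم دامت الثورة الجزائرية؟"]

bleu = evaluate.load("bleu")      # reports BLEU up to max_order n-grams
rouge = evaluate.load("rouge")    # reports rouge1 / rouge2 / rougeL
meteor = evaluate.load("meteor")

print(bleu.compute(predictions=predictions, references=references, max_order=4))
print(rouge.compute(predictions=predictions, references=references))
print(meteor.compute(predictions=predictions, references=references))
```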