---
library_name: peft
base_model: google/mt5-base
license: apache-2.0
language:
- ar
pipeline_tag: summarization
tags:
- summarization
- mt5
- pytorch
- transformers
---
# Mojiz
Mojiz is a fine-tuned mT5 model for Arabic summarization.
## Model Description
Mojiz is a parameter-efficient (PEFT) adapter trained on top of [google/mt5-base](https://huggingface.co/google/mt5-base) for abstractive summarization of Arabic text. This repository stores only the adapter weights; the base model is loaded separately at inference time.
## Usage
```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Load the base mT5 model, then apply the Mojiz adapter weights on top of it
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-base")
model = PeftModel.from_pretrained(base_model, "ahmedabdelwahed/sft-base-12-epochs")
tokenizer = AutoTokenizer.from_pretrained("google/mt5-base")
```
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
### Framework versions
- PEFT 0.7.1