mohamedemam committed on
Commit
29af186
1 Parent(s): 0ae1bb4

Update README.md

Files changed (1)
  1. README.md +9 -2
README.md CHANGED
@@ -33,8 +33,15 @@ alt="drawing" width="600"/>
  8. [Citation](#citation)
  9. [Model Card Authors](#model-card-authors)
  # my fine tuned model
- >This model is fine tuned to generate a question with answers from a context , why is can be very usful this can help you to generate a dataset from a book article any thing you would to make from it dataset and train another model on this dataset
-
+ >This model is fine-tuned to generate a question with its answer from a context. This can be very useful: it lets you build a dataset from a book, an article, or anything else you want to turn into a dataset, and then train another model on it. Give the model any context prefixed with the question word you want, and it will extract a question and answer for you.
+ These are the prompt words I use:
+ >[ "which", "how", "when", "where", "who",
+ "whom", "whose", "why", "whereas",
+ "can", "could", "may", "might", "will", "would",
+ "shall", "should", "must", "do", "does", "did",
+ "is", "are", "am", "was", "were",
+ "be", "being", "been",
+ "have", "has", "had", "if"]
  # TL;DR
 
  If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more than 1000 additional tasks covering also more languages.
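The usage the updated README describes (question word + context in, question and answer out) could be sketched as below. This is only an illustration: the model id `your-username/your-model-id` is a placeholder, and the exact prompt separator and `transformers` pipeline settings are assumptions, not taken from the README.

```python
# Sketch of the workflow the README describes: prefix a context with a question
# word, feed it to the fine-tuned seq2seq model, and get back a question+answer.
# The model id and prompt format here are assumptions for illustration only.

# Question words the README lists as supported prompts (abridged).
PROMPT_WORDS = ["which", "how", "when", "where", "who", "why", "can", "should"]


def build_input(prompt_word: str, context: str) -> str:
    """Prefix the context with the question word the model should use."""
    if prompt_word not in PROMPT_WORDS:
        raise ValueError(f"unsupported prompt word: {prompt_word}")
    return f"{prompt_word}: {context}"


def generate_qa(prompt_word: str, context: str) -> str:
    """Run the fine-tuned model on the prompted context (assumed API/model id)."""
    # Lazy import so the prompt-building helper works without transformers installed.
    from transformers import pipeline

    qa = pipeline("text2text-generation", model="your-username/your-model-id")
    return qa(build_input(prompt_word, context))[0]["generated_text"]


if __name__ == "__main__":
    print(build_input("why", "The sky appears blue because of Rayleigh scattering."))
```

The heavy lifting stays in `generate_qa`; `build_input` is separate so you can batch-build prompted inputs over a whole book or article before running generation.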