Tasks: Other
Modalities: Text
Formats: parquet
Languages: English
Libraries: Datasets, Dask
VictorSanh committed
Commit 0d6d527
1 Parent(s): 3444e31

update point of contact + data splits sizes

Files changed (2):
  1. README.md +1 -1
  2. data_splits.csv +661 -0
README.md CHANGED
@@ -42,7 +42,7 @@ task_categories:
  - **Homepage:** https://bigscience.huggingface.co/promptsource
  - **Repository:** https://github.com/bigscience-workshop/promptsource/
  - **Paper:** TODO
- - **Point of Contact:** Victor Sanh ([email protected])
+ - **Point of Contact:** [Victor Sanh](mailto:[email protected])
 
  ### Dataset Summary
 
data_splits.csv ADDED
@@ -0,0 +1,661 @@
+ Data(sub)set|Number of examples per splits
+ adversarial_qa_dbert_answer_the_following_q|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbert_based_on|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbert_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}
+ adversarial_qa_dbert_question_context_answer|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbert_tell_what_it_is|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbidaf_answer_the_following_q|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbidaf_based_on|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbidaf_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}
+ adversarial_qa_dbidaf_question_context_answer|{'train': 10000, 'validation': 1000}
+ adversarial_qa_dbidaf_tell_what_it_is|{'train': 10000, 'validation': 1000}
+ adversarial_qa_droberta_answer_the_following_q|{'train': 10000, 'validation': 1000}
+ adversarial_qa_droberta_based_on|{'train': 10000, 'validation': 1000}
+ adversarial_qa_droberta_generate_question|{'train': 10000, 'validation': 1000, 'test': 1000}
+ adversarial_qa_droberta_question_context_answer|{'train': 10000, 'validation': 1000}
+ adversarial_qa_droberta_tell_what_it_is|{'train': 10000, 'validation': 1000}
+ ag_news_classify|{'train': 120000, 'test': 7600}
+ ag_news_classify_question_first|{'train': 120000, 'test': 7600}
+ ag_news_classify_with_choices|{'train': 120000, 'test': 7600}
+ ag_news_classify_with_choices_question_first|{'train': 120000, 'test': 7600}
+ ag_news_recommend|{'train': 120000, 'test': 7600}
+ ag_news_which_section|{'train': 120000, 'test': 7600}
+ ag_news_which_section_choices|{'train': 120000, 'test': 7600}
+ ai2_arc_ARC_Challenge_heres_a_problem|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Challenge_i_am_hesitating|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Challenge_multiple_choice|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Challenge_pick_false_options|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Challenge_pick_the_most_correct_option|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Challenge_qa_options|{'train': 1119, 'validation': 299, 'test': 1172}
+ ai2_arc_ARC_Easy_heres_a_problem|{'train': 2251, 'validation': 570, 'test': 2376}
+ ai2_arc_ARC_Easy_i_am_hesitating|{'train': 2251, 'validation': 570, 'test': 2376}
+ ai2_arc_ARC_Easy_multiple_choice|{'train': 2251, 'validation': 570, 'test': 2376}
+ ai2_arc_ARC_Easy_pick_false_options|{'train': 2251, 'validation': 570, 'test': 2376}
+ ai2_arc_ARC_Easy_pick_the_most_correct_option|{'train': 2251, 'validation': 570, 'test': 2376}
+ ai2_arc_ARC_Easy_qa_options|{'train': 2251, 'validation': 570, 'test': 2376}
+ amazon_polarity_Is_this_product_review_positive|{'train': 3600000, 'test': 400000}
+ amazon_polarity_Is_this_review|{'train': 3600000, 'test': 400000}
+ amazon_polarity_Is_this_review_negative|{'train': 3600000, 'test': 400000}
+ amazon_polarity_User_recommend_this_product|{'train': 3600000, 'test': 400000}
+ amazon_polarity_convey_negative_or_positive_sentiment|{'train': 3600000, 'test': 400000}
+ amazon_polarity_flattering_or_not|{'train': 3600000, 'test': 400000}
+ amazon_polarity_negative_or_positive_tone|{'train': 3600000, 'test': 400000}
+ amazon_polarity_user_satisfied|{'train': 3600000, 'test': 400000}
+ amazon_polarity_would_you_buy|{'train': 3600000, 'test': 400000}
+ anli_GPT_3_style_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_GPT_3_style_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_GPT_3_style_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_GPT_3_style_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_GPT_3_style_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_GPT_3_style_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_MNLI_crowdsource_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_MNLI_crowdsource_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_MNLI_crowdsource_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_MNLI_crowdsource_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_MNLI_crowdsource_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_MNLI_crowdsource_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_always_sometimes_never_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_always_sometimes_never_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_always_sometimes_never_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_always_sometimes_never_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_always_sometimes_never_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_always_sometimes_never_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_based_on_the_previous_passage_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_based_on_the_previous_passage_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_based_on_the_previous_passage_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_based_on_the_previous_passage_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_based_on_the_previous_passage_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_based_on_the_previous_passage_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_can_we_infer_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_can_we_infer_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_can_we_infer_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_can_we_infer_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_can_we_infer_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_can_we_infer_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_claim_true_false_inconclusive_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_claim_true_false_inconclusive_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_claim_true_false_inconclusive_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_claim_true_false_inconclusive_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_claim_true_false_inconclusive_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_claim_true_false_inconclusive_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_consider_always_sometimes_never_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_consider_always_sometimes_never_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_consider_always_sometimes_never_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_consider_always_sometimes_never_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_consider_always_sometimes_never_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_consider_always_sometimes_never_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_does_it_follow_that_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_does_it_follow_that_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_does_it_follow_that_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_does_it_follow_that_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_does_it_follow_that_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_does_it_follow_that_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_does_this_imply_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_does_this_imply_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_does_this_imply_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_does_this_imply_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_does_this_imply_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_does_this_imply_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_guaranteed_possible_impossible_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_guaranteed_possible_impossible_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_guaranteed_possible_impossible_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_guaranteed_possible_impossible_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_guaranteed_possible_impossible_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_guaranteed_possible_impossible_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_guaranteed_true_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_guaranteed_true_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_guaranteed_true_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_guaranteed_true_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_guaranteed_true_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_guaranteed_true_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_justified_in_saying_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_justified_in_saying_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_justified_in_saying_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_justified_in_saying_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_justified_in_saying_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_justified_in_saying_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_must_be_true_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_must_be_true_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_must_be_true_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_must_be_true_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_must_be_true_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_must_be_true_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_should_assume_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_should_assume_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_should_assume_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_should_assume_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_should_assume_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_should_assume_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ anli_take_the_following_as_truth_r1|{'train': 16946, 'validation': 1000, 'test': 1000}
+ anli_take_the_following_as_truth_r1_score_eval|{'train': 50838, 'validation': 3000, 'test': 3000}
+ anli_take_the_following_as_truth_r2|{'train': 45460, 'validation': 1000, 'test': 1000}
+ anli_take_the_following_as_truth_r2_score_eval|{'train': 136380, 'validation': 3000, 'test': 3000}
+ anli_take_the_following_as_truth_r3|{'train': 100459, 'validation': 1200, 'test': 1200}
+ anli_take_the_following_as_truth_r3_score_eval|{'train': 301377, 'validation': 3600, 'test': 3600}
+ app_reviews_categorize_rating_using_review|{'train': 288065}
+ app_reviews_convert_to_rating|{'train': 288065}
+ app_reviews_convert_to_star_rating|{'train': 288065}
+ app_reviews_generate_review|{'train': 288065}
+ cnn_dailymail_3.0.0_2_or_3_sentences|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_generate_story|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_news_card_view|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_news_stock|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_news_summary|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_spice_up_story|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_sum_in_brief|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_tldr_summary|{'train': 287113, 'validation': 13368, 'test': 11490}
+ cnn_dailymail_3.0.0_write_an_outline|{'train': 287113, 'validation': 13368, 'test': 11490}
+ common_gen_Example_prompt|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_Given_concepts_type_1|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_Given_concepts_type_2|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_Put_together|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_choice_in_concept_centric_sentence_generation|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_random_task_template_prompt|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_sentence_to_concepts|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_topic_to_sentence|{'train': 67389, 'validation': 4018, 'test': 1497}
+ common_gen_topics_from_the_sentence|{'train': 67389, 'validation': 4018, 'test': 1497}
+ cos_e_v1.11_aligned_with_common_sense|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_description_question_option_id|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_description_question_option_text|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_explain_why_human|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_generate_explanation_given_text|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_i_think|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_question_description_option_id|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_question_description_option_text|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_question_option_description_id|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_question_option_description_text|{'train': 9741, 'validation': 1221}
+ cos_e_v1.11_rationale|{'train': 9741, 'validation': 1221}
+ cosmos_qa_context_answer_to_question|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_description_question_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_description_question_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_description_question_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_question_description_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_question_description_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_context_question_description_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_description_context_question_answer_id|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_description_context_question_answer_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_description_context_question_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_no_prompt_id|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_no_prompt_text|{'train': 25262, 'validation': 2985, 'test': 6963}
+ cosmos_qa_only_question_answer|{'train': 25262, 'validation': 2985, 'test': 6963}
+ dbpedia_14_given_a_choice_of_categories_|{'train': 560000, 'test': 70000}
+ dbpedia_14_given_a_list_of_category_what_does_the_title_belong_to|{'train': 560000, 'test': 70000}
+ dbpedia_14_given_list_what_category_does_the_paragraph_belong_to|{'train': 560000, 'test': 70000}
+ dbpedia_14_pick_one_category_for_the_following_text|{'train': 560000, 'test': 70000}
+ dream_answer_to_dialogue|{'train': 6116, 'validation': 2040, 'test': 2041}
+ dream_baseline|{'train': 6116, 'validation': 2040, 'test': 2041}
+ dream_generate_first_utterance|{'train': 6116, 'validation': 2040, 'test': 2041}
+ dream_generate_last_utterance|{'train': 6116, 'validation': 2040, 'test': 2041}
+ dream_read_the_following_conversation_and_answer_the_question|{'train': 6116, 'validation': 2040, 'test': 2041}
+ duorc_ParaphraseRC_answer_question|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_build_story_around_qa|{'train': 58752, 'validation': 13111, 'test': 13449}
+ duorc_ParaphraseRC_decide_worth_it|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_extract_answer|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_generate_question|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_generate_question_by_answer|{'train': 58752, 'validation': 13111, 'test': 13449}
+ duorc_ParaphraseRC_movie_director|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_question_answering|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_ParaphraseRC_title_generation|{'train': 69524, 'validation': 15591, 'test': 15857}
+ duorc_SelfRC_answer_question|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_build_story_around_qa|{'train': 60094, 'validation': 12845, 'test': 12415}
+ duorc_SelfRC_decide_worth_it|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_extract_answer|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_generate_question|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_generate_question_by_answer|{'train': 60094, 'validation': 12845, 'test': 12415}
+ duorc_SelfRC_movie_director|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_question_answering|{'train': 60721, 'validation': 12961, 'test': 12559}
+ duorc_SelfRC_title_generation|{'train': 60721, 'validation': 12961, 'test': 12559}
+ gigaword_TLDR|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_first_sentence_title|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_generate_summary_for_this|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_in_a_nutshell|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_make_a_title|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_reverse_writing|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_write_a_title_for_this_sentence|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_write_an_article|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ gigaword_write_its_sentence|{'train': 3803957, 'validation': 189651, 'test': 1951}
+ glue_mrpc_equivalent|{'train': 3668, 'validation': 408, 'test': 1725}
+ glue_mrpc_generate_paraphrase|{'train': 2474, 'validation': 279, 'test': 1147}
+ glue_mrpc_generate_sentence|{'train': 2474, 'validation': 279, 'test': 1147}
+ glue_mrpc_paraphrase|{'train': 3668, 'validation': 408, 'test': 1725}
+ glue_mrpc_replace|{'train': 3668, 'validation': 408, 'test': 1725}
+ glue_mrpc_same_thing|{'train': 3668, 'validation': 408, 'test': 1725}
+ glue_mrpc_want_to_know|{'train': 3668, 'validation': 408, 'test': 1725}
+ glue_qqp_answer|{'train': 363846, 'validation': 40430, 'test': 390965}
+ glue_qqp_duplicate|{'train': 363846, 'validation': 40430, 'test': 390965}
+ glue_qqp_duplicate_or_not|{'train': 363846, 'validation': 40430, 'test': 390965}
+ glue_qqp_meaning|{'train': 363846, 'validation': 40430, 'test': 390965}
+ glue_qqp_quora|{'train': 363846, 'validation': 40430, 'test': 390965}
+ glue_qqp_same_thing|{'train': 363846, 'validation': 40430, 'test': 390965}
+ hellaswag_Appropriate_continuation_Yes_or_No|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Open_ended_completion|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Open_ended_start|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Predict_ending_with_hint|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Predict_ending_with_hint_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}
+ hellaswag_Randomized_prompts_template|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Randomized_prompts_template_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}
+ hellaswag_Reversed_appropriate_continuation_Yes_or_No|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Topic_of_the_context|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_Topic_without_the_ending_answer|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_complete_first_then|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_complete_first_then_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}
+ hellaswag_how_ends|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_if_begins_how_continues|{'train': 39905, 'validation': 10042, 'test': 10003}
+ hellaswag_if_begins_how_continues_score_eval|{'train': 159620, 'validation': 40168, 'test': 40012}
+ imdb_Movie_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Movie_Expressed_Sentiment_2|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Negation_template_for_positive_and_negative|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Reviewer_Enjoyment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Reviewer_Enjoyment_Yes_No|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Reviewer_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Reviewer_Opinion_bad_good_choices|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Reviewer_Sentiment_Feeling|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Sentiment_with_choices_|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Text_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ imdb_Writer_Expressed_Sentiment|{'train': 25000, 'test': 25000, 'unsupervised': 50000}
+ kilt_tasks_hotpotqa_combining_facts|{'train': 88869, 'validation': 5600}
+ kilt_tasks_hotpotqa_complex_question|{'train': 88869, 'validation': 5600}
+ kilt_tasks_hotpotqa_final_exam|{'train': 88869, 'validation': 5600}
+ kilt_tasks_hotpotqa_formulate|{'train': 88869, 'validation': 5600}
+ kilt_tasks_hotpotqa_straighforward_qa|{'train': 88869, 'validation': 5600}
+ multi_news_distill|{'train': 44972, 'validation': 5622, 'test': 5622}
+ multi_news_expand_reverse_task_|{'train': 44972, 'validation': 5622, 'test': 5622}
+ multi_news_summarize|{'train': 44972, 'validation': 5622, 'test': 5622}
+ multi_news_summary_scenario|{'train': 44972, 'validation': 5622, 'test': 5622}
+ multi_news_synthesize|{'train': 44972, 'validation': 5622, 'test': 5622}
+ multi_news_what_are_the_key_points|{'train': 44972, 'validation': 5622, 'test': 5622}
+ openbookqa_main_choices|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_choose_an_answer_with_options|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_only_options|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_pick_answer_with_options|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_pick_using_id|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_which_correct|{'train': 4957, 'validation': 500, 'test': 500}
+ openbookqa_main_which_correct_inverse|{'train': 4957, 'validation': 500, 'test': 500}
+ paws_labeled_final_Concatenation|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_Concatenation_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_Meaning|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_Meaning_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_PAWS_ANLI_GPT3|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_PAWS_ANLI_GPT3_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_Rewrite|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_Rewrite_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_context_question|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_context_question_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ paws_labeled_final_paraphrase_task|{'train': 21829, 'validation': 3539, 'test': 3536}
+ paws_labeled_final_task_description_no_label|{'train': 49401, 'validation': 8000, 'test': 8000}
+ piqa_Correct_the_solution|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_Correct_the_solution_if_false_from_sol_1|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_Correct_the_solution_if_false_from_sol_2|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_Does_this_solution_make_sense_sol1|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_Does_this_solution_make_sense_sol2|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_choose_the_most_appropriate_solution|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_finish_sentence_with_correct_choice|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_no_prompt_needed|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_pick_correct_choice_index|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_pick_correct_choice_with_choice_given_before_goal|{'train': 16113, 'validation': 1838, 'test': 3084}
+ piqa_what_is_the_correct_ending|{'train': 16113, 'validation': 1838, 'test': 3084}
+ qasc_is_correct_1|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_is_correct_2|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_combined_facts_1|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_separated_facts_1|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_separated_facts_2|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_separated_facts_3|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_separated_facts_4|{'train': 8134, 'validation': 926, 'test': 920}
+ qasc_qa_with_separated_facts_5|{'train': 8134, 'validation': 926, 'test': 920}
+ quail_context_description_question_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_description_question_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_description_question_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_question_answer_description_id|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_question_answer_description_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_question_description_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_question_description_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_context_question_description_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_description_context_question_answer_id|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_description_context_question_answer_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_description_context_question_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_no_prompt_id|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quail_no_prompt_text|{'train': 10246, 'validation': 2164, 'challenge': 556}
+ quarel_choose_between|{'train': 1941, 'validation': 278, 'test': 552}
+ quarel_do_not_use|{'train': 1941, 'validation': 278, 'test': 552}
+ quarel_heres_a_story|{'train': 1941, 'validation': 278, 'test': 552}
+ quarel_logic_test|{'train': 1941, 'validation': 278, 'test': 552}
+ quarel_testing_students|{'train': 1941, 'validation': 278, 'test': 552}
+ quartz_answer_question_based_on|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_answer_question_below|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_given_the_fact_answer_the_q|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_having_read_above_passage|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_paragraph_question_plain_concat|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_read_passage_below_choose|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_use_info_from_paragraph_question|{'train': 2696, 'validation': 384, 'test': 784}
+ quartz_use_info_from_question_paragraph|{'train': 2696, 'validation': 384, 'test': 784}
+ quoref_Answer_Friend_Question|{'train': 19399, 'validation': 2418}
+ quoref_Answer_Question_Given_Context|{'train': 19399, 'validation': 2418}
+ quoref_Answer_Test|{'train': 19399, 'validation': 2418}
+ quoref_Context_Contains_Answer|{'train': 19399, 'validation': 2418}
+ quoref_Find_Answer|{'train': 19399, 'validation': 2418}
+ quoref_Found_Context_Online|{'train': 19399, 'validation': 2418}
+ quoref_Given_Context_Answer_Question|{'train': 19399, 'validation': 2418}
+ quoref_Guess_Answer|{'train': 19399, 'validation': 2418}
+ quoref_Guess_Title_For_Context|{'train': 19399, 'validation': 2418}
+ quoref_Read_And_Extract_|{'train': 19399, 'validation': 2418}
+ quoref_What_Is_The_Answer|{'train': 19399, 'validation': 2418}
+ race_high_Is_this_the_right_answer|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Read_the_article_and_answer_the_question_no_option_|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Select_the_best_answer|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Select_the_best_answer_generate_span_|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Select_the_best_answer_no_instructions_|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Taking_a_test|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Write_a_multi_choice_question_for_the_following_article|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_high_Write_a_multi_choice_question_options_given_|{'train': 62445, 'validation': 3451, 'test': 3498}
+ race_middle_Is_this_the_right_answer|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Read_the_article_and_answer_the_question_no_option_|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Select_the_best_answer|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Select_the_best_answer_generate_span_|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Select_the_best_answer_no_instructions_|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Taking_a_test|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Write_a_multi_choice_question_for_the_following_article|{'train': 25421, 'validation': 1436, 'test': 1436}
+ race_middle_Write_a_multi_choice_question_options_given_|{'train': 25421, 'validation': 1436, 'test': 1436}
358
+ ropes_background_new_situation_answer|{'train': 10924, 'validation': 1688}
359
+ ropes_background_situation_middle|{'train': 10924, 'validation': 1688}
360
+ ropes_given_background_situation|{'train': 10924, 'validation': 1688}
361
+ ropes_new_situation_background_answer|{'train': 10924, 'validation': 1688}
362
+ ropes_plain_background_situation|{'train': 10924, 'validation': 1688}
363
+ ropes_plain_bottom_hint|{'train': 10924, 'validation': 1688}
364
+ ropes_plain_no_background|{'train': 10924, 'validation': 1688}
365
+ ropes_prompt_beginning|{'train': 10924, 'validation': 1688}
366
+ ropes_prompt_bottom_hint_beginning|{'train': 10924, 'validation': 1688}
367
+ ropes_prompt_bottom_no_hint|{'train': 10924, 'validation': 1688}
368
+ ropes_prompt_mix|{'train': 10924, 'validation': 1688}
369
+ ropes_read_background_situation|{'train': 10924, 'validation': 1688}
370
+ rotten_tomatoes_Movie_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Movie_Expressed_Sentiment_2|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Reviewer_Enjoyment|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Reviewer_Enjoyment_Yes_No|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Reviewer_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Reviewer_Opinion_bad_good_choices|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Reviewer_Sentiment_Feeling|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Sentiment_with_choices_|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Text_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}
+ rotten_tomatoes_Writer_Expressed_Sentiment|{'train': 8530, 'validation': 1066, 'test': 1066}
+ samsum_Generate_a_summary_for_this_dialogue|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_Given_the_above_dialogue_write_a_summary|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_Sum_up_the_following_dialogue|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_Summarize_|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_Summarize_this_dialogue_|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_To_sum_up_this_dialog|{'train': 14732, 'validation': 818, 'test': 819}
+ samsum_Write_a_dialogue_that_match_this_summary|{'train': 14732, 'validation': 818, 'test': 819}
+ sciq_Direct_Question|{'train': 11679, 'validation': 1000, 'test': 1000}
+ sciq_Direct_Question_Closed_Book_|{'train': 11679, 'validation': 1000, 'test': 1000}
+ sciq_Multiple_Choice|{'train': 11679, 'validation': 1000, 'test': 1000}
+ sciq_Multiple_Choice_Closed_Book_|{'train': 11679, 'validation': 1000, 'test': 1000}
+ sciq_Multiple_Choice_Question_First|{'train': 11679, 'validation': 1000, 'test': 1000}
+ social_i_qa_Check_if_a_random_answer_is_valid_or_not|{'train': 33410, 'validation': 1954}
+ social_i_qa_Generate_answer|{'train': 33410, 'validation': 1954}
+ social_i_qa_Generate_the_question_from_the_answer|{'train': 33410, 'validation': 1954}
+ social_i_qa_I_was_wondering|{'train': 33410, 'validation': 1954}
+ social_i_qa_Show_choices_and_generate_answer|{'train': 33410, 'validation': 1954}
+ social_i_qa_Show_choices_and_generate_index|{'train': 33410, 'validation': 1954}
+ squad_v2_Jeopardy_with_Context|{'train': 86821, 'validation': 5928}
+ squad_v2_Jeopardy_without_Context|{'train': 86821, 'validation': 5928}
+ squad_v2_Questions_with_Context|{'train': 130319, 'validation': 11873}
+ squad_v2_Questions_with_Context_Without_Prompt_Keywords|{'train': 130319, 'validation': 11873}
+ squad_v2_Questions_with_Context_Without_Prompt_Keywords_unanswerable|{'train': 130319, 'validation': 11873}
+ squad_v2_Questions_with_Context_unanswerable|{'train': 130319, 'validation': 11873}
+ squad_v2_Topic_Prediction_Context|{'train': 130319, 'validation': 11873}
+ squad_v2_Topic_Prediction_Context_with_randomized_prompt_options|{'train': 130319, 'validation': 11873}
+ squad_v2_Topic_Prediction_Context_with_randomized_prompt_options_placed_in_the_end|{'train': 130319, 'validation': 11873}
+ squad_v2_Topic_Prediction_Question_and_Answer_Pair|{'train': 86821, 'validation': 5928}
+ squad_v2_Trivia|{'train': 86821, 'validation': 5928}
+ squad_v2_Unanwerable_question|{'train': 130319, 'validation': 11873}
+ super_glue_boolq_GPT_3_Style|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_I_wonder_|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_after_reading|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_based_on_the_following_passage|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_based_on_the_previous_passage|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_could_you_tell_me_|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_exam|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_exercise|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_valid_binary|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_boolq_yes_no_question|{'train': 9427, 'validation': 3270, 'test': 3245}
+ super_glue_cb_GPT_3_style|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_GPT_3_style_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_MNLI_crowdsource|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_MNLI_crowdsource_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_always_sometimes_never|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_always_sometimes_never_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_based_on_the_previous_passage|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_based_on_the_previous_passage_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_can_we_infer|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_can_we_infer_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_claim_true_false_inconclusive|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_claim_true_false_inconclusive_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_consider_always_sometimes_never|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_consider_always_sometimes_never_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_does_it_follow_that|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_does_it_follow_that_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_does_this_imply|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_does_this_imply_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_guaranteed_possible_impossible|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_guaranteed_possible_impossible_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_guaranteed_true|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_guaranteed_true_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_justified_in_saying|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_justified_in_saying_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_must_be_true|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_must_be_true_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_should_assume|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_should_assume_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_cb_take_the_following_as_truth|{'train': 250, 'validation': 56, 'test': 250}
+ super_glue_cb_take_the_following_as_truth_score_eval|{'train': 750, 'validation': 168, 'test': 750}
+ super_glue_copa_C1_or_C2_premise_so_because_|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_C1_or_C2_premise_so_because__score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa__As_a_result_C1_or_C2_|{'train': 202, 'validation': 48, 'test': 250}
+ super_glue_copa__As_a_result_C1_or_C2__score_eval|{'train': 404, 'validation': 96, 'test': 500}
+ super_glue_copa__What_could_happen_next_C1_or_C2_|{'train': 202, 'validation': 48, 'test': 250}
+ super_glue_copa__What_could_happen_next_C1_or_C2__score_eval|{'train': 404, 'validation': 96, 'test': 500}
+ super_glue_copa__which_may_be_caused_by|{'train': 198, 'validation': 52, 'test': 250}
+ super_glue_copa__which_may_be_caused_by_score_eval|{'train': 396, 'validation': 104, 'test': 500}
+ super_glue_copa__why_C1_or_C2|{'train': 198, 'validation': 52, 'test': 250}
+ super_glue_copa__why_C1_or_C2_score_eval|{'train': 396, 'validation': 104, 'test': 500}
+ super_glue_copa_best_option|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_best_option_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_cause_effect|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_cause_effect_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_choose|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_choose_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_exercise|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_exercise_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_i_am_hesitating|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_i_am_hesitating_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_more_likely|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_more_likely_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_copa_plausible_alternatives|{'train': 400, 'validation': 100, 'test': 500}
+ super_glue_copa_plausible_alternatives_score_eval|{'train': 800, 'validation': 200, 'test': 1000}
+ super_glue_multirc_I_was_going_to_say_|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_Would_it_be_good_to_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_confirm|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_correct|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_decide_valid|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_found_this_answer|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_grading|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_is_a_correct_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_is_the_correct_answer_|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_multirc_paragraph_question_is_it_|{'train': 27243, 'validation': 4848, 'test': 9693}
+ super_glue_record_Add_sentence_after_after_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_Add_sentence_after_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_Can_you_figure_out_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_GPT_3_style_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_GPT_3_style_summary_only_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_GPT_3_style_with_labels_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_GPT_3_style_with_labels_without_hyphens_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_GPT_3_style_without_hyphens_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_In_the_question_above_the_placeholder_stands_for|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_New_highlight_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_News_article_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_Summary_first_continuation_choices_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_What_could_the_placeholder_be_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_Which_one_is_the_placeholder_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_choose_between|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_corrupted|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_exercise|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_pick_one_option|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_the_placeholder_refers_to_|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_record_trying_to_decide|{'train': 100730, 'validation': 10000, 'test': 10000}
+ super_glue_rte_GPT_3_style|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_GPT_3_style_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_MNLI_crowdsource|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_MNLI_crowdsource_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_based_on_the_previous_passage|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_based_on_the_previous_passage_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_can_we_infer|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_can_we_infer_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_does_it_follow_that|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_does_it_follow_that_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_does_this_imply|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_does_this_imply_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_guaranteed_true|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_guaranteed_true_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_justified_in_saying|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_justified_in_saying_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_must_be_true|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_must_be_true_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_rte_should_assume|{'train': 2490, 'validation': 277, 'test': 3000}
+ super_glue_rte_should_assume_score_eval|{'train': 4980, 'validation': 554, 'test': 6000}
+ super_glue_wic_GPT_3_prompt|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_GPT_3_prompt_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_GPT_3_prompt_with_label|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_GPT_3_prompt_with_label_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_affirmation_true_or_false|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_affirmation_true_or_false_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_grammar_homework|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_grammar_homework_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_polysemous|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_polysemous_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_question_context|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_question_context_meaning|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_question_context_meaning_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_question_context_meaning_with_label|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_question_context_meaning_with_label_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_question_context_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_same_sense|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_same_sense_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wic_similar_sense|{'train': 5428, 'validation': 638, 'test': 1400}
+ super_glue_wic_similar_sense_score_eval|{'train': 10856, 'validation': 1276, 'test': 2800}
+ super_glue_wsc.fixed_GPT_3_Style|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_GPT_3_Style_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_I_think_they_mean|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_I_think_they_mean_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_Who_or_what_is_are|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_Who_or_what_is_are_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_by_p_they_mean|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_by_p_they_mean_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_does_p_stand_for|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_does_p_stand_for_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_does_the_pronoun_refer_to|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_does_the_pronoun_refer_to_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_in_other_words|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_in_other_words_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_p_is_are_r|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_p_is_are_r_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_replaced_with|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_replaced_with_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ super_glue_wsc.fixed_the_pronoun_refers_to|{'train': 554, 'validation': 104, 'test': 146}
+ super_glue_wsc.fixed_the_pronoun_refers_to_score_eval|{'train': 1108, 'validation': 208, 'test': 292}
+ trec_fine_grained_ABBR|{'train': 86, 'test': 9}
+ trec_fine_grained_ABBR_context_first|{'train': 86, 'test': 9}
+ trec_fine_grained_DESC|{'train': 1162, 'test': 138}
+ trec_fine_grained_DESC_context_first|{'train': 1162, 'test': 138}
+ trec_fine_grained_ENTY|{'train': 1250, 'test': 94}
+ trec_fine_grained_HUM|{'train': 1223, 'test': 65}
+ trec_fine_grained_HUM_context_first|{'train': 1223, 'test': 65}
+ trec_fine_grained_LOC|{'train': 835, 'test': 81}
+ trec_fine_grained_LOC_context_first|{'train': 835, 'test': 81}
+ trec_fine_grained_NUM|{'train': 896, 'test': 113}
+ trec_fine_grained_NUM_context_first|{'train': 896, 'test': 113}
+ trec_fine_grained_open|{'train': 5452, 'test': 500}
+ trec_fine_grained_open_context_first|{'train': 5452, 'test': 500}
+ trec_pick_the_best_descriptor|{'train': 5452, 'test': 500}
+ trec_trec1|{'train': 5452, 'test': 500}
+ trec_trec2|{'train': 5452, 'test': 500}
+ trec_what_category_best_describe|{'train': 5452, 'test': 500}
+ trec_which_category_best_describes|{'train': 5452, 'test': 500}
+ trivia_qa_unfiltered_first_person_context|{'train': 87622, 'validation': 11313, 'test': 10832}
+ trivia_qa_unfiltered_formal_description|{'train': 87622, 'validation': 11313, 'test': 10832}
+ trivia_qa_unfiltered_guess_question|{'train': 87622, 'validation': 11313}
+ trivia_qa_unfiltered_question_answer|{'train': 87622, 'validation': 11313, 'test': 10832}
+ trivia_qa_unfiltered_question_with_instruction|{'train': 87622, 'validation': 11313, 'test': 10832}
+ web_questions_get_the_answer|{'train': 3778, 'test': 2032}
+ web_questions_potential_correct_answer|{'train': 3778, 'test': 2032}
+ web_questions_question_answer|{'train': 3778, 'test': 2032}
+ web_questions_short_general_knowledge_q|{'train': 3778, 'test': 2032}
+ web_questions_whats_the_answer|{'train': 3778, 'test': 2032}
+ wiki_bio_comprehension|{'train': 582639, 'test': 72829, 'val': 72831}
+ wiki_bio_guess_person|{'train': 582639, 'test': 72829, 'val': 72831}
+ wiki_bio_key_content|{'train': 582639, 'test': 72829, 'val': 72831}
+ wiki_bio_what_content|{'train': 582639, 'test': 72829, 'val': 72831}
+ wiki_bio_who|{'train': 582639, 'test': 72829, 'val': 72831}
+ wiki_hop_original_choose_best_object_affirmative_1|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_choose_best_object_affirmative_2|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_choose_best_object_affirmative_3|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_choose_best_object_interrogative_1|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_choose_best_object_interrogative_2|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_explain_relation|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_generate_object|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_generate_subject|{'train': 43738, 'validation': 5129}
+ wiki_hop_original_generate_subject_and_object|{'train': 43738, 'validation': 5129}
+ wiki_qa_Decide_good_answer|{'train': 20360, 'validation': 2733, 'test': 6165}
+ wiki_qa_Direct_Answer_to_Question|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_Generate_Question_from_Topic|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_Is_This_True_|{'train': 20360, 'validation': 2733, 'test': 6165}
+ wiki_qa_Jeopardy_style|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_Topic_Prediction_Answer_Only|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_Topic_Prediction_Question_Only|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_Topic_Prediction_Question_and_Answer_Pair|{'train': 1040, 'validation': 140, 'test': 293}
+ wiki_qa_automatic_system|{'train': 20360, 'validation': 2733, 'test': 6165}
+ wiki_qa_exercise|{'train': 20360, 'validation': 2733, 'test': 6165}
+ wiki_qa_found_on_google|{'train': 20360, 'validation': 2733, 'test': 6165}
+ winogrande_winogrande_debiased_Replace|{'train': 9248, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_debiased_Replace_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_debiased_does_underscore_refer_to|{'train': 9248, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_debiased_does_underscore_refer_to_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_debiased_fill_in_the_blank|{'train': 9248, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_debiased_fill_in_the_blank_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_debiased_stand_for|{'train': 9248, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_debiased_stand_for_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_debiased_underscore_refer_to|{'train': 9248, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_debiased_underscore_refer_to_score_eval|{'train': 18496, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_xl_Replace|{'train': 40398, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_xl_Replace_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_xl_does_underscore_refer_to|{'train': 40398, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_xl_does_underscore_refer_to_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_xl_fill_in_the_blank|{'train': 40398, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_xl_fill_in_the_blank_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_xl_stand_for|{'train': 40398, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_xl_stand_for_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}
+ winogrande_winogrande_xl_underscore_refer_to|{'train': 40398, 'validation': 1267, 'test': 1767}
+ winogrande_winogrande_xl_underscore_refer_to_score_eval|{'train': 80796, 'validation': 2534, 'test': 3534}
+ wiqa_does_the_supposed_perturbation_have_an_effect|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_effect_with_label_answer|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_effect_with_string_answer|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_what_is_the_final_step_of_the_following_process|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_what_is_the_missing_first_step|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_what_might_be_the_first_step_of_the_process|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_what_might_be_the_last_step_of_the_process|{'train': 29808, 'validation': 6894, 'test': 3003}
+ wiqa_which_of_the_following_is_the_supposed_perturbation|{'train': 29808, 'validation': 6894, 'test': 3003}
+ xsum_DOC_boils_down_to_simple_idea_that|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_DOC_given_above_write_one_sentence|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_DOC_how_would_you_rephrase_few_words|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_DOC_tldr|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_DOC_write_summary_of_above|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_article_DOC_summary|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_college_roommate_asked_DOC_so_I_recap|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_read_below_DOC_write_abstract|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_summarize_DOC|{'train': 204045, 'validation': 11332, 'test': 11334}
+ xsum_summarize_this_DOC_summary|{'train': 204045, 'validation': 11332, 'test': 11334}
+ yelp_review_full_based_on_that|{'train': 650000, 'test': 50000}
+ yelp_review_full_format_rating|{'train': 650000, 'test': 50000}
+ yelp_review_full_format_score|{'train': 650000, 'test': 50000}
+ yelp_review_full_format_star|{'train': 650000, 'test': 50000}
+ yelp_review_full_on_a_scale|{'train': 650000, 'test': 50000}
+ yelp_review_full_so_i_would|{'train': 650000, 'test': 50000}
+ yelp_review_full_this_place|{'train': 650000, 'test': 50000}