# FLAN-T5-Large fine-tuned on History Q&A Generation

This model is a fine-tuned version of [google/flan-t5-large](https://huggingface.co/google/flan-t5-large) on a history question-answer dataset.

## Model description

This model is designed to generate multiple-choice questions, answers, and explanations based on historical text inputs.
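
For illustration, a minimal inference sketch using the 🤗 Transformers API is shown below. The checkpoint id and the prompt format are placeholders, since this card does not specify them; adjust both to the published repository and the prompt style used during fine-tuning.

```python
# Minimal inference sketch (assumptions: the checkpoint id below is a placeholder,
# and the model accepts a plain historical passage as input).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "path/to/this-finetuned-flan-t5-large"  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

passage = (
    "The Treaty of Versailles, signed in 1919, formally ended World War I "
    "and imposed heavy reparations on Germany."
)

inputs = tokenizer(passage, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```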

## Intended uses & limitations

This model is intended for educational purposes and to assist in creating history-related quiz materials.

## Training and evaluation data

The model was trained on the dataset [ambrosfitz/just_history_large_mc](https://huggingface.co/datasets/ambrosfitz/just_history_large_mc).

## Training procedure

The model was trained using the following hyperparameters:
- Number of epochs: 1
- Batch size: 3
- Learning rate: not specified
- Other hyperparameters (optimizer, learning-rate schedule, weight decay): not specified
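
A fine-tuning sketch under the stated hyperparameters is shown below. The dataset column names, sequence lengths, and learning rate are assumptions for illustration only; they are not documented in this card.

```python
# Training sketch (assumptions: column names "prompt"/"completion", max lengths,
# and the learning rate are illustrative, not taken from this card).
from datasets import load_dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

base = "google/flan-t5-large"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

dataset = load_dataset("ambrosfitz/just_history_large_mc")

def preprocess(batch):
    # Column names are assumed; adjust to the dataset's actual schema.
    model_inputs = tokenizer(batch["prompt"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["completion"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-large-history-qa",
    num_train_epochs=1,             # from this card
    per_device_train_batch_size=3,  # from this card
    per_device_eval_batch_size=3,
    learning_rate=3e-4,             # assumption: not documented in this card
    save_strategy="epoch",
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"] if "test" in tokenized else None,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```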

## Results

Test set results (after ~1 epoch of training):
- Eval loss: 0.7525
- Eval runtime: 102.63 s
- Eval samples per second: 19.49
- Eval steps per second: 6.50
- Epoch: 0.9989