---
license: apache-2.0
language:
- en
---
All of the data together is around 41 GB. It consists of the last hidden states of 131,072 samples from RefinedWeb, padded/truncated on the left to 512 tokens and fed through google/flan-t5-small.
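A minimal sketch (not the original preprocessing script) of how a record of this shape could be produced with the transformers API; the helper name `encode_sample` and the exact tokenizer settings are assumptions based on the description above:

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = T5EncoderModel.from_pretrained("google/flan-t5-small")
model.eval()

# Pad/truncate on the left, per the description above (assumed settings).
tokenizer.padding_side = "left"
tokenizer.truncation_side = "left"

def encode_sample(text: str) -> dict:
    batch = tokenizer(
        text,
        max_length=512,
        padding="max_length",
        truncation=True,
        return_tensors="pt",
    )
    with torch.no_grad():
        out = model(**batch)
    return {
        "encoding": out.last_hidden_state[0].tolist(),          # (512, 512) = (tokens, d_model)
        "text": text,
        "attention_mask": batch["attention_mask"][0].tolist(),  # (512,), 0 marks pad positions
    }
```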
Structure of each record:
{
  "encoding": List, shaped (512, 512), i.e. (tokens, d_model),
  "text": String, the original text that was encoded,
  "attention_mask": List, binary mask to pass to your model alongside "encoding" so that pad tokens are not attended to
}
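The "encoding" and "attention_mask" fields are plain lists, so using a record mostly means converting them back to tensors and applying the mask. A small sketch, assuming a record dict shaped as above (the pooling function is only an illustration, not part of the dataset):

```python
import torch

def mean_pool(record: dict) -> torch.Tensor:
    encoding = torch.tensor(record["encoding"], dtype=torch.float32)              # (512, 512)
    attention_mask = torch.tensor(record["attention_mask"], dtype=torch.float32)  # (512,)

    # Zero out pad positions, then average over the real tokens only.
    mask = attention_mask.unsqueeze(-1)                                           # (512, 1)
    return (encoding * mask).sum(dim=0) / mask.sum().clamp(min=1.0)               # (512,)
```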