---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 27016370584
num_examples: 5268403
- name: validation
num_bytes: 2947948744
num_examples: 574873
download_size: 7022414209
dataset_size: 29964319328
---
# Dataset Card for "legaltokenized1024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
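
A minimal loading sketch is shown below. The repository id `vegeta/legaltokenized1024` is inferred from the hub page and may differ; the field names and split sizes come from the metadata above, and the 1024-token sequence length is assumed from the dataset name.

```python
from datasets import load_dataset

# Load the pre-tokenized splits; the repository id is an assumption
# inferred from the hub page header.
ds = load_dataset("vegeta/legaltokenized1024")

train = ds["train"]        # 5,268,403 examples (per the metadata above)
val = ds["validation"]     # 574,873 examples

# Each example is already tokenized: input_ids (int32 sequence) and
# attention_mask (int8 sequence), presumably fixed at 1024 tokens.
example = train[0]
print(len(example["input_ids"]), len(example["attention_mask"]))
```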