---
license: apache-2.0
task_categories:
- robotics
---
# Dataset
<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/6018554e68258223ca22136f/6og2VldKOfp0Ci31h-r_w.mp4"></video>
This dataset is used to train a transporter network for real-world pick-and-place tasks within the RAD lab at the University of Edinburgh. The dataset is in TFDS format and was collected using [moveit2_data_collector](https://github.com/peterdavidfagan/moveit2_data_collector). In its current state the dataset is being tested as we prove out the overall pipeline; keep monitoring this dataset and related repos for documentation updates.
# Download
An example of downloading and loading the dataset is given below; as larger datasets are uploaded, this example script will change:
```python
import os
import tarfile

import tensorflow_datasets as tfds
from huggingface_hub import hf_hub_download

DATA_DIR = "/home/robot"
FILENAME = "data.tar.xz"
EXTRACTED_FILENAME = "data"
FILEPATH = os.path.join(DATA_DIR, FILENAME)
EXTRACTED_FILEPATH = os.path.join(DATA_DIR, EXTRACTED_FILENAME)

# download data from huggingface
hf_hub_download(
    repo_id="peterdavidfagan/transporter_networks",
    repo_type="dataset",
    filename=FILENAME,
    local_dir=DATA_DIR,
)

# uncompress file and remove the archive
with tarfile.open(FILEPATH, 'r:xz') as tar:
    tar.extractall(path=DATA_DIR)
os.remove(FILEPATH)

# load with tfds
ds = tfds.builder_from_directory(EXTRACTED_FILEPATH).as_dataset()['train']

# basic inspection of data
print(ds.element_spec)
for eps in ds:
    print(eps["extrinsics"])
    for step in eps["steps"]:
        print(step["is_first"])
        print(step["is_last"])
        print(step["is_terminal"])
        print(step["action"])
```
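
For training, it is often convenient to flatten the per-episode `steps` into a single stream of transitions. Below is a minimal sketch, assuming the standard TFDS/RLDS layout shown above, where each episode's `steps` field is itself a nested `tf.data.Dataset`; the field names match those printed by `ds.element_spec`, and the batch size is purely illustrative:

```python
import tensorflow_datasets as tfds

# flatten episodes into a single stream of steps
steps = ds.flat_map(lambda eps: eps["steps"])

# batch and materialize as numpy arrays for a training loop
# (take(1) just limits the example to a single batch)
for batch in tfds.as_numpy(steps.batch(8).take(1)):
    print(batch["action"])
```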