---
license: apache-2.0
task_categories:
- robotics
---
# Dataset
This dataset is used to train a transporter network for real-world pick-and-place tasks. The dataset is stored in TFDS format and was collected using moveit2_data_collector.
## Download
An example of downloading and loading the dataset is given below; this example script will change as larger datasets are uploaded:
```python
import os
import tarfile

import tensorflow_datasets as tfds
from huggingface_hub import hf_hub_download

DATA_DIR = "/home/robot"
FILENAME = "data.tar.xz"
EXTRACTED_FILENAME = "data"
FILEPATH = os.path.join(DATA_DIR, FILENAME)

# download data from huggingface
hf_hub_download(
    repo_id="peterdavidfagan/transporter_networks",
    repo_type="dataset",
    filename=FILENAME,
    local_dir=DATA_DIR,
)

# uncompress file
with tarfile.open(FILEPATH, 'r:xz') as tar:
    tar.extractall(path=DATA_DIR)
os.remove(FILEPATH)

# load with tfds
ds = tfds.builder_from_directory(DATA_DIR).as_dataset()['train']

# basic inspection of data
print(ds.element_spec)
for eps in ds:
    print(eps["extrinsics"])
    for step in eps["steps"]:
        print(step["is_first"])
        print(step["is_last"])
        print(step["is_terminal"])
        print(step["action"])
```
## Model Training
Please see the robot_learning_baselines repository for examples of training the transporter network architecture in Flax.
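For orientation, the sketch below shows what a minimal Flax training step for a pick-prediction module could look like. The network, observation shape, label format, and loss here are illustrative placeholders only; the actual transporter architecture and training loop are defined in robot_learning_baselines.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn
import optax

# Hypothetical minimal pick module: a small fully convolutional network that
# maps an RGB observation to a per-pixel pick logit map.
class PickFCN(nn.Module):
    @nn.compact
    def __call__(self, x):
        x = nn.Conv(32, (3, 3), padding="SAME")(x)
        x = nn.relu(x)
        x = nn.Conv(32, (3, 3), padding="SAME")(x)
        x = nn.relu(x)
        x = nn.Conv(1, (1, 1))(x)   # per-pixel pick logits
        return x[..., 0]            # (B, H, W)

model = PickFCN()
rng = jax.random.PRNGKey(0)
dummy_obs = jnp.zeros((1, 64, 64, 3))  # placeholder observation shape
params = model.init(rng, dummy_obs)
tx = optax.adam(1e-4)
opt_state = tx.init(params)

@jax.jit
def train_step(params, opt_state, obs, pick_pixel_labels):
    """One gradient step: cross-entropy between the predicted pick heatmap
    and the ground-truth pick pixel, flattened over the image."""
    def loss_fn(p):
        logits = model.apply(p, obs)                  # (B, H, W)
        logits = logits.reshape(logits.shape[0], -1)  # (B, H*W)
        return optax.softmax_cross_entropy_with_integer_labels(
            logits, pick_pixel_labels).mean()
    loss, grads = jax.value_and_grad(loss_fn)(params)
    updates, opt_state = tx.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

# Example call with dummy data (labels are flattened pick-pixel indices).
obs = jnp.zeros((8, 64, 64, 3))
labels = jnp.zeros((8,), dtype=jnp.int32)
params, opt_state, loss = train_step(params, opt_state, obs, labels)
```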
## Pretrained Models
To be published soon under https://huggingface.co/peterdavidfagan.