---
license: apache-2.0
task_categories:
- robotics
---

# Dataset

<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/6018554e68258223ca22136f/6og2VldKOfp0Ci31h-r_w.mp4"></video>

This dataset is used to train a transporter network for real-world pick-and-place tasks in the RAD lab at the University of Edinburgh. The dataset is stored in TFDS format and was collected using [moveit2_data_collector](https://github.com/peterdavidfagan/moveit2_data_collector). The dataset is currently being tested as we prove out the overall pipeline; keep monitoring this dataset and the related repositories for documentation updates.
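
If you only want to check what the dataset contains, the TFDS builder metadata can be inspected without iterating any episodes. This is a minimal sketch, assuming the archive has already been downloaded and extracted to `/home/robot/data` as in the Download section below:

```python
import tensorflow_datasets as tfds

# build a reader over the extracted TFDS directory
# (the path is an assumption, matching the Download example below)
builder = tfds.builder_from_directory("/home/robot/data")

print(builder.info.features)  # feature spec of episodes and steps
print(builder.info.splits)    # available splits, e.g. 'train'
```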

# Download

An example of downloading and loading the dataset is given below; as larger datasets are uploaded, this example script will change:

```python
import os
import tarfile

import tensorflow_datasets as tfds
from huggingface_hub import hf_hub_download

DATA_DIR = "/home/robot"
FILENAME = "data.tar.xz"
EXTRACTED_FILENAME = "data"
FILEPATH = os.path.join(DATA_DIR, FILENAME)
EXTRACTED_FILEPATH = os.path.join(DATA_DIR, EXTRACTED_FILENAME)

# download the compressed archive from the Hugging Face Hub
hf_hub_download(
    repo_id="peterdavidfagan/transporter_networks",
    repo_type="dataset",
    filename=FILENAME,
    local_dir=DATA_DIR,
)

# uncompress the archive, then delete it once extracted
with tarfile.open(FILEPATH, 'r:xz') as tar:
    tar.extractall(path=DATA_DIR)
os.remove(FILEPATH)

# load the extracted dataset with tfds
ds = tfds.builder_from_directory(EXTRACTED_FILEPATH).as_dataset()['train']

# basic inspection of the data
print(ds.element_spec)
for eps in ds:
    print(eps["extrinsics"])
    for step in eps["steps"]:
        print(step["is_first"])
        print(step["is_last"])
        print(step["is_terminal"])
        print(step["action"])
```
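
Once loaded, episodes can be flattened into individual timesteps for training. The following is a minimal sketch (not the lab's official training pipeline), assuming the step structure printed above and that all step features have fixed shapes so they can be batched:

```python
import tensorflow as tf

# each episode's "steps" field is itself a tf.data.Dataset, so flat_map
# yields one element per timestep across all episodes
steps = ds.flat_map(lambda eps: eps["steps"])

# standard tf.data input pipeline: shuffle, batch, prefetch
steps = steps.shuffle(1_000).batch(32, drop_remainder=True)
steps = steps.prefetch(tf.data.AUTOTUNE)

for batch in steps.take(1):
    print(batch["action"])  # batched actions for a transporter-style model
```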