---
license: cc-by-nc-sa-4.0
size_categories:
  - 1K<n<10K
task_categories:
  - keypoint-estimation
---

Paper

RT-Pose: A 4D Radar Tensor-based 3D Human Pose Estimation and Localization Benchmark (ECCV 2024)

RT-Pose introduces a human pose estimation (HPE) dataset and benchmark by integrating a unique combination of calibrated radar ADC data, 4D radar tensors, stereo RGB images, and LiDAR point clouds. This integration marks a significant advancement in studying human pose analysis with multi-modal data.


Dataset Details

Dataset Description

Sensors

The data collection hardware system comprises two RGB cameras, a non-repetitive horizontal scanning LiDAR, and a cascade imaging radar module.

Data Statistics

We collected the dataset in 40 scenes spanning indoor and outdoor environments.

The dataset comprises 72,000 frames distributed across 240 sequences. The structured organization ensures a realistic distribution of human motions, which is crucial for robust analysis and model training.


Please check the paper for more details.

Dataset Sources

  • Repository with data processing and baseline method code: RT-POSE
  • Paper: RT-Pose: A 4D Radar Tensor-based 3D Human Pose Estimation and Localization Benchmark (arXiv:2407.13930)

Uses

  1. Download the dataset from Hugging Face (total data size: ~1.2 TB); a download sketch follows this list.
  2. Follow the data processing tool to convert the radar ADC samples into radar tensors (total size of the downloaded data plus the saved radar tensors: ~41 TB).
  3. See the data loading code and the baseline method's training and testing code in the same repository, RT-POSE.
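For step 1, a minimal download sketch using `huggingface_hub` is shown below. The repository id and the `allow_patterns` filter are assumptions for illustration; check the dataset page and folder layout before running it.

```python
# Minimal download sketch (assumption: the dataset lives at the Hugging Face
# repo id "uwipl/RT-Pose"; verify the id on the dataset page before running).
from huggingface_hub import snapshot_download

# Start with a small subset (e.g. metadata files) before committing to the
# full ~1.2 TB download. The allow_patterns value is illustrative; adjust it
# to the actual folder layout of the dataset repository.
local_path = snapshot_download(
    repo_id="uwipl/RT-Pose",            # assumed repo id; replace if it differs
    repo_type="dataset",
    local_dir="./RT-Pose",              # where files are materialized on disk
    allow_patterns=["*.md", "*.json"],  # drop this filter to fetch everything
)
print("Dataset files downloaded to:", local_path)
```

Dropping `allow_patterns` fetches the full dataset; the radar-tensor conversion in step 2 is then handled by the processing tool in the RT-POSE repository.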

Citation

BibTeX:

@article{rtpose2024,
  title={RT-Pose: A 4D Radar Tensor-based 3D Human Pose Estimation and Localization Benchmark},
  author={Yuan-Hao Ho and Jen-Hao Cheng and Sheng Yao Kuan and Zhongyu Jiang and Wenhao Chai and Hsiang-Wei Huang and Chih-Lung Lin and Jenq-Neng Hwang},
  journal={arXiv preprint arXiv:2407.13930},
  year={2024}
}

Dataset Card Contact

Jen-Hao (Andy) Cheng, [email protected]