---
tags:
- HAR
---
Part of MONSTER: <https://arxiv.org/abs/2502.15122>.

***Opportunity*** is a comprehensive, multi-sensor dataset designed for human activity recognition in a naturalistic environment [1]. Collected from four participants performing typical daily activities, it spans six recording sessions per person: five unscripted "Activities of Daily Living" (ADL) runs and one structured "drill" run of scripted activities. The dataset carries rich, multi-level annotations; for our analysis, we focus specifically on the locomotion classes, which comprise five categories: Stand, Walk, Sit, Lie, and Null (no specific activity detected).

Data collection covers 113 sensor channels from body-worn, object-attached, and ambient sensors. These channels capture body movements, object interactions, and environmental context through inertial measurement units, accelerometers, and switches, and their variety and placement allow a detailed examination of physical activities and transitions in a natural setting. To prepare the data for analysis, we segment it with a sliding window of 100 time steps and an overlap of 50 time steps (i.e., a stride of 50 between consecutive windows). This segmentation enables a model to capture both the continuity of an activity and the subtle transitions between activities, improving recognition across the locomotion classes. The dataset is divided into cross-validation folds by participant.
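
As a rough sketch of this windowing step (not the released MONSTER preprocessing code), the segmentation can be expressed in a few lines of NumPy. The toy data, the array shapes, and the majority-vote rule for assigning a single label to each window are assumptions made for the example:

```python
import numpy as np

def sliding_windows(data, labels, window=100, step=50):
    """Segment a (T, C) sensor array into overlapping windows.

    A 100-step window advanced by 50 steps gives the 50-step overlap
    described above. Each window is assigned the majority label among
    its time steps -- an illustrative choice, not necessarily the rule
    used for the released dataset.
    """
    segments, segment_labels = [], []
    for start in range(0, len(data) - window + 1, step):
        stop = start + window
        segments.append(data[start:stop])
        # Majority vote over the window's per-step labels.
        values, counts = np.unique(labels[start:stop], return_counts=True)
        segment_labels.append(values[np.argmax(counts)])
    return np.stack(segments), np.array(segment_labels)

# Toy example: 1,000 time steps over the 113 Opportunity channels.
X = np.random.randn(1000, 113)
y = np.random.randint(0, 5, size=1000)  # 5 locomotion classes
windows, window_labels = sliding_windows(X, y)
print(windows.shape)  # (19, 100, 113)
```

For the participant-based folds, a leave-one-subject-out split (e.g., scikit-learn's `LeaveOneGroupOut` with participant IDs as the groups) would reproduce this structure: each fold holds out all windows from one participant, so no subject appears in both the training and test data.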

[1] Ricardo Chavarriaga, Hesam Sagha, Alberto Calatroni, Sundara Tejaswi Digumarti, Gerhard Tröster, José del R. Millán, and Daniel Roggen (2013). The Opportunity challenge: A benchmark database for on-body sensor-based activity recognition. *Pattern Recognition Letters*, 34(15):2033–2042.