Human Action Recognition
via Wearable Motion Sensor Networks
Allen Y. Yang, Annarita Giani, Roberta Giannatonio, Katherine Gilani, Sameer Iyengar,
Philip Kuryloski, Edmund Seto, Ville-Pekka Seppa, Curtis Wang,
Victor Shia, Posu Yan, Roozbeh Jafari, Shankar Sastry, and Ruzena Bajcsy
Notice: Please read and understand the copyright of the following software packages as specified in the individual items. The copyright varies from package to package depending on its contributor(s). The packages should NOT be used for any commercial purpose without the direct consent of their author(s).
This project is partially supported by NSF TRUST Center at UC Berkeley, ARO MURI W911NF-06-1-0076, Startup Funds from University of Texas at Dallas, Tampere University of Technology, and Telecom Italia Laboratory.
Level I: Body Sensor Layer
Our wearable sensor network consists of multiple motion sensors placed at multiple body locations and a Wisepla biosensor, which communicate with a base station attached to either a stationary or a mobile computer. The communication modules in the sensor nodes and the base station use the commercially available Tmote Sky boards. Tmote Sky runs TinyOS on an 8 MHz microcontroller with 10 KB of RAM and communicates using the IEEE 802.15.4 wireless protocol.
Demo Video: Physiological Sensor Design by Tampere University of Technology.
Level II: Personal Network Layer
In this level, a network of heterogeneous wearable sensors is integrated with a Nokia N800 smart phone as a mobile station. The mobile station communicates with the wireless sensors via a Tmote Sky base station connected to its USB port. The software system on the mobile station is designed to locally manage the configuration and status of the individual sensors, record sensor measurements from the individual carrying the wearable sensors, and further relay the sensor data to stationary servers to support high-level applications.
Level III: Global Network Layer
Demo video: DexterNet integration with Vanderbilt THIS healthcare database.
Wearable Action Recognition Database (WARD) version 1.0
We construct and maintain a benchmark database for human action recognition using a wearable motion sensor network, called WARD. The purpose of WARD is two-fold: 1. A public and relatively stable data set provides a platform for quantitative comparison of existing algorithms for human action recognition using wearable motion sensors. 2. The database should steer the development of innovative algorithms in the area of distributed pattern recognition by bringing together investigators from the pattern recognition and sensor networks communities.
Important Note: The WARD database contains motion sequences with partially missing sensor data, mainly due to battery failure and/or network packet loss. Such missing data are indicated as "Inf" in MATLAB. Verification should be in place to detect such exceptions before classification is performed.
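As a minimal sketch of such verification, the check below assumes a motion sequence has been loaded into a NumPy array of shape (samples, channels), with missing samples stored as Inf as in the MATLAB files; the function names are illustrative, not part of the WARD distribution.

```python
import numpy as np

def valid_channel_mask(seq):
    """Boolean mask of sensor channels that contain no missing (Inf) samples.

    seq: 2-D array of shape (num_samples, num_channels); missing data are Inf.
    """
    return ~np.isinf(seq).any(axis=0)

def drop_missing_channels(seq):
    """Keep only the channels with complete data before classification."""
    return seq[:, valid_channel_mask(seq)]
```

A classifier can then be restricted to the surviving channels, or the sequence rejected outright if too few channels remain.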