Distributed Human Action Recognition
via Wearable Motion Sensor Networks

Allen Y. Yang, Annarita Giani, Roberta Giannantonio, Katherine Gilani, Sameer Iyengar,
Philip Kuryloski, Edmund Seto, Ville-Pekka Seppa, Curtis Wang,
Victor Shia, Posu Yan, Roozbeh Jafari, Shankar Sastry, and Ruzena Bajcsy



© Copyright Notice: It is important that you read and understand the copyright of the following software packages as specified in the individual items. The copyright varies with each package due to its contributor(s). The packages should NOT be used for any commercial purposes without direct consent of their author(s).

This project is partially supported by NSF TRUST Center at UC Berkeley, ARO MURI W911NF-06-1-0076, Startup Funds from University of Texas at Dallas, Tampere University of Technology, and Telecom Italia Laboratory.

Project Roadmap


This project seeks solutions for long-term monitoring of human motion and the associated energy expenditure in normal living environments. To support robust coverage of body sensor networks in both indoor and outdoor environments, we propose a scalable wireless communication system. The system consists of three layers (a conceptual sketch of the data flow follows this list):
  1. The body sensor layer (BSL) deals with the design of the wireless body sensors and their instrumentation.
  2. The personal network layer (PNL) coordinates the body sensors on a single subject with a mobile base station.
  3. In the global network layer (GNL), multiple PNLs interconnected via the Internet provide wireless coverage for persistent monitoring in both indoor and outdoor environments.
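As a rough illustration of how the three layers fit together, the following Python sketch traces a single reading from the body sensor layer through a personal network to the global layer. All class and function names are hypothetical and are not taken from the project software.

    # Conceptual sketch only: names below are hypothetical, not part of the
    # DexterNet codebase; they illustrate how a reading moves BSL -> PNL -> GNL.
    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class SensorReading:              # produced at the body sensor layer (BSL)
        node_id: int                  # which motion sensor on the body
        timestamp_ms: int
        accel: Tuple[int, int, int]   # 3-axis accelerometer sample
        gyro: Tuple[int, int, int]    # 3-axis gyroscope sample

    class PersonalNetwork:
        """Personal network layer (PNL): one mobile station per subject."""

        def __init__(self, subject_id: str):
            self.subject_id = subject_id
            self.buffer: List[SensorReading] = []

        def collect(self, reading: SensorReading) -> None:
            # Aggregate readings from all sensors worn by this single subject.
            self.buffer.append(reading)

        def flush_to_global(self, server) -> None:
            # Relay buffered data over the Internet to the global network
            # layer (GNL), e.g. a stationary server, for persistent monitoring.
            server.store(self.subject_id, self.buffer)
            self.buffer = []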

Level I: Body Sensor Layer

Our wearable sensor network consists of multiple motion sensors placed at different body locations and a Wisepla biosensor, all of which communicate with a base station attached to either a stationary or a mobile computer. The communication module in the sensor nodes and the base station uses commercially available Tmote Sky boards. The Tmote Sky runs TinyOS on an 8 MHz microcontroller with 10 KB of RAM and communicates using the IEEE 802.15.4 wireless protocol.
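For illustration only, the sketch below shows how a base station program might decode one motion-sensor payload received over IEEE 802.15.4 from a Tmote Sky node. The field layout is an assumption made for this example; the actual packet format is defined by the TinyOS firmware on the nodes and may differ.

    # Base-station-side decoding sketch. The payload layout is assumed:
    # node id (uint8), sequence number (uint16), then 3-axis accelerometer
    # and 3-axis gyroscope samples (int16 each), little-endian.
    import struct

    PAYLOAD_FORMAT = "<BH6h"
    PAYLOAD_SIZE = struct.calcsize(PAYLOAD_FORMAT)  # 15 bytes

    def decode_payload(payload: bytes) -> dict:
        """Unpack one hypothetical motion-sensor payload."""
        if len(payload) != PAYLOAD_SIZE:
            raise ValueError("unexpected payload length: %d" % len(payload))
        node_id, seq, ax, ay, az, gx, gy, gz = struct.unpack(PAYLOAD_FORMAT, payload)
        return {
            "node": node_id,
            "seq": seq,
            "accel": (ax, ay, az),
            "gyro": (gx, gy, gz),
        }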

Demo Video: Physiological Sensor Design by Tampere University of Technology.

 

Level II: Personal Network Layer

In this level, a network of heterogeneous wearable sensors is integrated with a Nokia N800 Internet tablet serving as a mobile station. The mobile station communicates with the wireless sensors via a Tmote Sky base station connected to its USB port. The software system on the mobile station locally manages the configuration and status of the individual sensors, records sensor measurements from the subject wearing them, and relays the sensor data to stationary servers to support high-level applications.
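As a rough sketch of the relay step, the example below forwards locally buffered readings from the mobile station to a stationary server over HTTP. The server address, endpoint, and JSON schema are placeholders for illustration, not the project's actual interface.

    # Hypothetical relay from the PNL mobile station to a stationary server.
    import json
    import urllib.request

    SERVER_URL = "http://example.org/dexternet/upload"  # placeholder address

    def relay_measurements(subject_id: str, readings: list) -> bool:
        """Forward buffered sensor readings; returns True on HTTP 200."""
        body = json.dumps({"subject": subject_id, "readings": readings}).encode("utf-8")
        request = urllib.request.Request(
            SERVER_URL,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.getcode() == 200

    # Example usage with two fabricated readings:
    # relay_measurements("subject-01", [{"node": 1, "accel": [0, 0, 980]},
    #                                   {"node": 2, "accel": [12, -5, 975]}])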



Level III: Global Network Layer

Demo Video: DexterNet integration with the Vanderbilt THIS healthcare database.


Publications:

  1. Allen Yang, Roozbeh Jafari, Philip Kuryloski, Sameer Iyengar, Shankar Sastry, and Ruzena Bajcsy. Distributed segmentation and classification of human actions using a wearable motion sensor network. Workshop on Human Communicative Behavior Analysis, CVPR, 2008. [PDF]
  2. Allen Yang, Roozbeh Jafari, Shankar Sastry, and Ruzena Bajcsy. Distributed Recognition of Human Actions Using Wearable Motion Sensor Networks. JAISE, 2009. [PDF]
  3. Philip Kuryloski, Annarita Giani, Roberta Giannantonio, Katherine Gilani, Ville-Pekka Seppa, Edmund Seto, Raffaele Gravina, Victor Shia, Curtis Wang, Posu Yan, Allen Yang, Jari Hyttinen, Shankar Sastry, Stephen Wicker, and Ruzena Bajcsy. DexterNet: An open platform for heterogeneous body sensor networks and its applications. Body Sensor Networks Workshop, 2009. [PDF]
  4. Allen Yang, Philip Kuryloski, and Ruzena Bajcsy. WARD: A Wearable Action Recognition Database. CHI Workshop, 2009. [PDF]
  5. Eric Guenterberg, Allen Yang, Roozbeh Jafari, Ruzena Bajcsy, and Shankar Sastry. A Lightweight and Real-Time Fine-Grained Signal Annotation Technique Based on Markov Models in Body Sensor Networks. IEEE Transactions on Information Technology in BioMedicine, 2009. [PDF]
  6. Edmund Seto, Annarita Giani, Victor Shia, Curtis Wang, Posu Yan, Allen Yang, Michael Jerrett, and Ruzena Bajcsy. A Wireless Body Sensor Network for the Prevention and Management of Asthma. IEEE Symposium on Industrial Embedded Systems (SIES), 2009. [PDF]
  7. Allen Yang, Michael Gastpar, Ruzena Bajcsy, and Shankar Sastry. Distributed Sensor Perception via Sparse Representation. Proceedings of the IEEE, 2010. [PDF]
  8. Edmund Seto, Eladio Martin, Allen Yang, Posu Yan, Raffaele Gravina, Irving Lin, Curtis Wang, Michael Roy, Victor Shia, and Ruzena Bajcsy. Opportunistic Strategies for Lightweight Signal Processing for Body Sensor Networks. PETRA, 2010. [PDF]



Benchmark: Wearable Action Recognition Database (WARD) version 1.0

We construct and maintain WARD, a benchmark database for human action recognition using a wearable motion sensor network. The purpose of WARD is two-fold: 1. A public and relatively stable data set provides a platform for quantitative comparison of existing algorithms for human action recognition using wearable motion sensors. 2. The database should steer the development of future innovative algorithms in distributed pattern recognition by bringing together investigators from the pattern recognition and sensor network communities.
  1. WARD version 1.0: [download] (PLEASE cite our JAISE paper when referring to WARD in your publications.)

Last Update: 9-10-2008. This database cannot be used in commercial systems without direct consent of the University of California.

Important Note: The WARD database contains motion sequences with partially missing sensor data, mainly due to battery failure and/or network packet loss. Missing samples are indicated as "Inf" in the MATLAB data. Verification should be in place to detect these entries before classification is performed.
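For example, assuming a WARD trial has been loaded into a NumPy array with one row per time sample and one column per sensor channel, the following sketch screens out samples containing Inf before classification. The array layout and function names are illustrative, not part of the WARD distribution.

    # Pre-classification check for missing WARD samples (marked Inf).
    import numpy as np

    def has_missing_data(trial: np.ndarray) -> bool:
        """Report whether a trial contains any Inf (missing) entries."""
        return bool(np.isinf(trial).any())

    def drop_missing_samples(trial: np.ndarray) -> np.ndarray:
        """Remove time samples in which any sensor channel reads Inf."""
        valid = ~np.isinf(trial).any(axis=1)  # True where every channel is finite
        return trial[valid]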


