The SPHERE Challenge: Activity Recognition with Multimodal Sensor Data


The SPHERE Challenge has Ended! 
SPHERE would like to take this opportunity to thank all participants of the challenge, as well as drivendata.org and the AARP Foundation, for their participation and interest in this competition. 

A prize was awarded to the overall winners of the challenge, and the final leaderboard is available here.

For the participants who attended the ECML-PKDD Discovery Challenge, a further prize was awarded to the winning submission. A representative of the winners ('MSUHPDM') is shown in the photograph below receiving the winning certificate. 

Winners of SPHERE Challenge ECML-PKDD Discovery Challenge

From left to right, Tom Diethe (challenge organiser), Elio Masciari (challenge organiser), Lei Liu (representative of the MSUHPDM team), and Niall Twomey (challenge organiser). 

Challenge hosted by drivendata.org

We are delighted to announce that the SPHERE Challenge, which takes place in conjunction with ECML-PKDD 2016, is now hosted at DrivenData. All challenge features (including the leaderboard, forum, and user/team registration) are managed on the DrivenData website: 

https://www.drivendata.org/competitions/42/senior-data-science-safe-aging-with-sphere/

  • Solution Proposal Deadline: July 19 2016, 24:00 - as long as it is July 19th anywhere in the world (time zone of Midway, US Minor Outlying Islands, UTC-12)
  • Paper submission deadline: July 8 2016 (all participants are welcome to submit; selected teams will be invited to submit their solutions to the challenge workshop)
  • Notification: July 25 2016

The SPHERE Project

Obesity, depression, stroke, falls, cardiovascular and musculoskeletal disease are some of the biggest health issues and fastest-rising categories of health-care costs. The financial expenditure associated with these is widely regarded as unsustainable and the impact on quality of life is felt by millions of people in the UK each day. Smart technologies can unobtrusively quantify activities of daily living, and these can provide long-term behavioural patterns that are objective, insightful measures for clinical professionals and caregivers.

To this end the EPSRC-funded “Sensor Platform for HEalthcare in Residential Environment (SPHERE)” Interdisciplinary Research Collaboration (IRC) has designed a multi-modal sensor system driven by data analytics requirements. The system is under test in a single house, and will be deployed in a general population of 100 homes in Bristol (UK). The data sets collected will be made available to researchers in a variety of communities. 

Data is collected from the following three sensing modalities:

  1. wrist-worn accelerometer; 
  2. RGB-D cameras (i.e. video with depth information); and
  3. passive environmental sensors.

With these sensor data, we can learn patterns of behaviour and track the deterioration or progress of people suffering from, or recovering from, various medical conditions. To achieve this, we focus on activity recognition over multiple tiers, with the two main prediction tasks of SPHERE being:

  1. prediction of Activities of Daily Living (ADL) (e.g. tasks such as meal preparation, watching television); and
  2. prediction of posture/ambulation (e.g. walking, sitting, transitioning).

Reliable prediction of ADL allows us to model the behaviour of residents over time, e.g. what a typical day consists of, at what times particular activities are performed, etc. Prediction of posture and ambulation complements ADL prediction, and can inform us about the physical well-being of the participant: how mobile and responsive they are, how active or sedentary, etc.
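As an illustration of the kind of behavioural modelling this enables, the sketch below aggregates a hypothetical sequence of per-second activity predictions into time spent per activity. It is not part of the challenge materials; the timestamps and labels are invented for the example.

# Illustrative sketch only (not part of the challenge materials): aggregate
# hypothetical per-second activity predictions into a daily summary.
import pandas as pd

# Invented predictions: one row per second with a predicted activity label.
predictions = pd.DataFrame({
    "time": pd.date_range("2016-06-01 09:00:00", periods=6, freq="s"),
    "activity": ["p_sit", "p_sit", "t_sit_stand", "p_stand", "a_walk", "a_walk"],
})

# One row per second, so counting rows gives seconds spent in each activity.
seconds_per_activity = predictions.groupby("activity").size()
print(seconds_per_activity)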

The SPHERE Challenge

The task for the SPHERE challenge is to predict posture and ambulation labels given the sensor data from the recruited participants. 

We will henceforth refer to posture/ambulation as ‘activity recognition’ for brevity. It is worth noting that the term activity recognition has different interpretations when viewed from accelerometer, video, and environmental sensor perspectives. The definition of activities used in this challenge aligns most closely with the terminology used in the field of accelerometer-based prediction. 

For this task, accelerometer, RGB-D, and environmental data are provided. Accelerometer data are sampled at 20 Hz and given in raw format (see Section 2.1). Raw video is not provided, in order to preserve the anonymity of the participants; instead, extracted features relating to the centre of mass and bounding box of the detected persons are provided (see Section 2.2). Environmental data consist of Passive Infra-Red (PIR) sensor activations, and these are given in raw format (see Section 2.3). A minimal loading sketch is given below.
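The sketch assumes the three modalities arrive as CSV files; the file names below are placeholders of our own choosing, so consult the data formats page for the actual layout.

# Minimal loading sketch; the CSV file names are placeholders and may not
# match the released data layout (see the data formats page).
import pandas as pd

accel = pd.read_csv("acceleration.csv")     # raw tri-axial samples at 20 Hz
video = pd.read_csv("video_features.csv")   # centre-of-mass and bounding-box features
pir = pd.read_csv("pir.csv")                # raw PIR sensor activations

# The nominal accelerometer sampling interval is 1/20 s = 50 ms.
print("expected sample spacing: %.0f ms" % (1000.0 / 20))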

Twenty (posture/ambulation) activity labels are annotated in our dataset, and these are enumerated below together with short descriptions: 

  1. a_ascend: ascend stairs;
  2. a_descend: descend stairs;
  3. a_jump: jump;
  4. a_loadwalk: walk with load;
  5. a_walk: walk;
  6. p_bent: bending;
  7. p_kneel: kneeling;
  8. p_lie: lying;
  9. p_sit: sitting;
  10. p_squat: squatting;
  11. p_stand: standing;
  12. t_bend: stand-to-bend;
  13. t_kneel_stand: kneel-to-stand;
  14. t_lie_sit: lie-to-sit;
  15. t_sit_lie: sit-to-lie;
  16. t_sit_stand: sit-to-stand;
  17. t_stand_kneel: stand-to-kneel;
  18. t_stand_sit: stand-to-sit;
  19. t_straighten: bend-to-stand; and
  20. t_turn: turn.

The prefix ‘a_’ on a label indicates an ambulation activity (i.e. an activity requiring continuing movement), the prefix ‘p_’ indicates a static posture (i.e. times when the participant is stationary), and the prefix ‘t_’ indicates a posture-to-posture transition. These labels are the target variables to be predicted in the challenge, and can be written down directly in code, as shown in the sketch below.
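The sketch is our own illustration of the label set and its prefix convention; the variable names are not part of any challenge API.

# The twenty target labels, grouped by their prefix convention.
LABELS = [
    "a_ascend", "a_descend", "a_jump", "a_loadwalk", "a_walk",
    "p_bent", "p_kneel", "p_lie", "p_sit", "p_squat", "p_stand",
    "t_bend", "t_kneel_stand", "t_lie_sit", "t_sit_lie", "t_sit_stand",
    "t_stand_kneel", "t_stand_sit", "t_straighten", "t_turn",
]

# Group into ambulation ('a_'), posture ('p_'), and transition ('t_') labels.
groups = {}
for label in LABELS:
    groups.setdefault(label.split("_", 1)[0], []).append(label)

assert len(LABELS) == 20 and sorted(groups) == ["a", "p", "t"]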

The SPHERE challenge will take place in conjunction with the ECML-PKDD 2016 conference: 

Scripted Data: The first stage of the challenge uses sensor data that was recorded when the participants performed a pre-defined script. This script is described in later sections of this document. The script can be downloaded from here.

Competition closes on June 19 2016, 24:00. Paper submission deadline: July 8 2016.

Participation

Please participate on drivendata.org 

https://www.drivendata.org/competitions/sphere

For the full set of rules governing participation and eligibility, see the rules page. 

Getting the data

Please download the data from drivendata: 

https://www.drivendata.org/competitions/sphere

Using this Data

The paper below provides a full description of the data and the data formats, and any use of the data must cite it:

Reference:
Niall Twomey, Tom Diethe, Meelis Kull, Hao Song, Massimo Camplani, Sion Hannuna, Xenofon Fafoutis, Ni Zhu, Pete Woznowski, Peter Flach, and Ian Craddock. The SPHERE Challenge: Activity Recognition with Multimodal Sensor Data. 2016.

BibTeX:
@article{twomey2016sphere,
  title   = {The {SPHERE} Challenge: Activity Recognition with Multimodal Sensor Data},
  author  = {Twomey, Niall and Diethe, Tom and Kull, Meelis and Song, Hao and Camplani, Massimo and Hannuna, Sion and Fafoutis, Xenofon and Zhu, Ni and Woznowski, Pete and Flach, Peter and Craddock, Ian},
  journal = {arXiv preprint arXiv:1603.00797},
  year    = {2016}
}

Contact Information

For any questions regarding the data, please contact: spherechallengeecml[at]gmail.com