CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

From: yonatan vaizman <[log in to unmask]>
Date: Fri, 26 Jan 2018 08:56:43 -0800
Dear colleagues,



I'd like to bring to your attention the *ExtraSensory Dataset* and the
*ExtraSensory App* for research and development of Behavioral Context
Recognition (human activity recognition, context awareness, behavior
monitoring, and more).



The *ExtraSensory Dataset*:
http://extrasensory.ucsd.edu

- *Large scale*: Over 300k recorded (and labeled) minutes from 60
participants.

- *Everyday devices*: Sensors from smartphone (iPhone/Android) and
smartwatch.

- *Diverse sensors*: Accelerometer, gyroscope, magnetometer, audio,
location, watch acceleration, ambient light, air pressure, and more.

- *In the wild*: Participants engaged in their regular behavior in their
natural environments.

- *Rich context*: Self-reported annotations, given as combinations from a
large vocabulary of context-labels, e.g. walking, running, indoors, at
home, at school, on a bus, driving, shower, toilet, with friends, computer
work, eating, phone in hand, phone in pocket.

- *Publicly available and free*! We encourage researchers to use this
dataset to develop, evaluate, and compare context-recognition systems.

- *Testbed for AI/ML*: See the website for example open problems, such as
time-series modeling, active learning, feature learning, and more.
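To give a feel for working with per-minute labeled data of this kind, here is a minimal sketch of separating sensor features from context labels in a table of recorded minutes. The column names below are illustrative assumptions, not the dataset's actual schema — see the website for the real file format.

```python
import pandas as pd

# Toy stand-in for one participant's labeled minutes (rows = minutes).
# Column names are hypothetical; the real dataset's schema may differ.
df = pd.DataFrame({
    "timestamp": [1449601597, 1449601657],
    "acc:magnitude:mean": [1.02, 0.98],  # assumed sensor-feature column
    "label:WALKING": [1, 0],             # assumed context-label column
    "label:AT_HOME": [0, 1],
})

# Split columns into sensor features and self-reported context labels.
feature_cols = [c for c in df.columns
                if c != "timestamp" and not c.startswith("label:")]
label_cols = [c for c in df.columns if c.startswith("label:")]

X = df[feature_cols].to_numpy()  # sensor-feature matrix (minutes x features)
Y = df[label_cols].to_numpy()    # multi-label context matrix (minutes x labels)
```

Note that each minute can carry several simultaneous labels (e.g. walking *and* at home), so Y is a multi-label matrix rather than a single class column.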



The *ExtraSensory App*:
http://extrasensory.ucsd.edu/ExtraSensoryApp

- *Publicly available and free*! The full source code is available
(including the Android phone app, the Pebble watch app, and the
server-side code).

- *Tool for data collection*: It collects sensor measurements and
self-reported context-labels. The flexible UI provides many self-reporting
methods, both in-situ and by recall. This is an improved version of the
app originally used to collect the ExtraSensory Dataset.

- *Tool for real-time context-recognition*: The classifier on the
server-side provides real-time probabilities for 51 context-labels. You
can plug in your own classifier. The app was recently used to develop apps
that use the recognized context for music streaming, automatic journaling,
auto-tagging phone pictures, health/lifestyle monitoring, and more.
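As a rough illustration of the "probability per context-label" output described above (not the app's actual server code), a multi-label context classifier typically scores each label independently, e.g. with a per-label sigmoid over learned weights:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in values: one sensor-feature vector and per-label weights.
# In a real system the weights would be learned from labeled minutes.
x = rng.normal(size=10)        # one minute's feature vector
W = rng.normal(size=(3, 10))   # one weight row per context-label
b = np.zeros(3)                # per-label bias terms

# Independent probability for each context-label (labels can co-occur,
# so the probabilities need not sum to 1).
probs = 1.0 / (1.0 + np.exp(-(W @ x + b)))
```

Because contexts like "walking" and "at home" can hold simultaneously, each label gets its own probability rather than competing in a single softmax.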



For an introduction to Behavioral Context Recognition,
see my lecture at https://www.youtube.com/watch?v=2cuhvEQZ_sI.



Cheers,
Yonatan Vaizman.
http://acsweb.ucsd.edu/~yvaizman

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
