CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

From: Merel M Jung <[log in to unmask]>
Date: Wed, 1 Apr 2015 06:59:22 +0000
Content-Type: text/plain
-----------------------------------------------------------------------
===========================
CALL FOR PARTICIPATION
===========================

Recognition of Social Touch Gestures Challenge 2015


The Grand Challenge workshop will be held on November 9th in conjunction
with the 17th ACM International Conference on Multimodal Interaction
(ICMI'15), Seattle, Washington, USA.

Conference website: icmi.acm.org/2015

To participate in the challenge, please send an email with your name(s)
and affiliation(s) to [log in to unmask]
Registration is due by May 1st.

Challenge homepage: http://www.utwente.nl/touch-challenge-2015


===========================
Description
===========================
Touch is an important non-verbal form of social interaction, used to
communicate emotions and social messages. Automatic recognition of
social touch is necessary to bring the tactile modality to areas such
as Human-Robot Interaction (HRI). If a robot can understand social
touch behavior, it can respond accordingly - resulting in richer and
more natural interaction.

In this challenge the focus is on the recognition of different hand
touch gestures. Two data sets will be made available, each with labeled
pressure/location data collected from similar matrix-type sensor grids
under conditions reflecting different application orientations:

(1) CoST: Corpus of Social Touch (Jung et al., 2014) includes 14
touch gestures, such as stroke, poke and hit, each performed in three
variations (normal, gentle and rough) on a mannequin arm.

(2) HAART: the Human-Animal Affective Robot Touch gesture set (Flagg
and MacLean, 2013) includes 7 touch gestures which humans use to
communicate emotion in animal interaction (Yohanan and MacLean, 2012),
recorded with a range of coverings, substrates and curvatures.

Participants can choose to work on either data set or on both. The
purpose of this challenge is to develop relevant features and
classification methods for recognizing social touch gestures.
Participants will share their findings at the ACM International
Conference on Multimodal Interaction (ICMI '15) in Seattle, USA.
Accepted papers will be included in the ICMI '15 challenge proceedings.
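To give a feel for the kind of pipeline this task invites, here is a
minimal sketch: summary features are extracted from a sequence of
pressure frames and gestures are classified with a nearest-centroid
rule. The 8x8 grid size, the two gesture names, the synthetic data and
the feature set are all assumptions made for illustration, not the
actual challenge data format or a required approach.

```python
import numpy as np

def extract_features(frames):
    """Summarize one gesture; frames is a (T, 8, 8) array of pressure
    samples (8x8 grid is an assumed size for this example)."""
    flat = frames.reshape(len(frames), -1)
    return np.array([
        flat.mean(),                      # overall mean pressure
        flat.max(),                       # peak pressure
        flat.std(),                       # pressure variability
        float(len(frames)),               # gesture duration in frames
        (flat > flat.mean()).mean(),      # fraction of high-pressure cells
    ])

class NearestCentroid:
    """Tiny nearest-centroid classifier, a stand-in for a real model."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = np.array(
            [X[np.array(y) == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Squared Euclidean distance from each sample to each centroid
        d = ((X[:, None, :] - self.centroids_[None]) ** 2).sum(axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

# Synthetic stand-in data: a "poke" is short and sharp, a "stroke" is
# long and soft (invented contrast, not measured from the corpora).
rng = np.random.default_rng(0)

def fake_gesture(kind):
    n_frames, scale = (10, 3.0) if kind == "poke" else (60, 1.0)
    return rng.random((n_frames, 8, 8)) * scale

kinds = ["poke", "stroke"] * 20
X = np.array([extract_features(fake_gesture(k)) for k in kinds])
clf = NearestCentroid().fit(X, kinds)
print(clf.predict(X[:4]))
```

Real entries would of course replace the synthetic data with the CoST
or HAART recordings and the nearest-centroid rule with a stronger
classifier; the point is only the frames-to-features-to-labels shape
of the problem.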


===========================
Important Dates
===========================
Release of training data: April 13th, 2015
Registration deadline: May 1st, 2015
Release of test data (without labels): May 18th, 2015
Submission of test set labels: May 25th, 2015
Performance results and release of test set labels: June 1st, 2015
Submission of paper: August 1st, 2015
Notification of acceptance: August 20th, 2015
Camera-ready submissions: September 15th, 2015
Challenge workshop: November 9th, 2015


===========================
Organizers
===========================
Merel Jung (University of Twente, The Netherlands)
Mannes Poel (University of Twente, The Netherlands)
Jeff Allen (University of British Columbia, Canada)
Laura Cang (University of British Columbia, Canada)

Contact: [log in to unmask]


-----------------------------------------------------------------------

---------------------------
Merel Jung - PhD Candidate
Human Media Interaction Group
University of Twente - The Netherlands

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
