From: MMCogEmS2011 <[log in to unmask]>
Date: Thu, 8 Sep 2011 09:50:22 +1000
[Apologies for cross-posting]

In view of the requests received to date, we are extending the paper submission deadline by a week to September 12, 2011.

=======================================================================

MMCogEmS: Inferring Cognitive and Emotional States from Multimodal Measures

http://www.nicta.com.au/workshop_icmi2011

in conjunction with the International Conference on Multimodal Interaction (ICMI 2011)

November 17, 2011, Alicante, Spain

Background

Multimodal, as well as unimodal, interactive data from behavioural signals such as speech, facial expression, and mouse and pen input have recently been shown to indicate the cognitive and emotional state of the user. Similarly, research into physiological signals such as the electroencephalogram (EEG), galvanic skin response (GSR), the electromyogram (EMG), heart rate, respiration and eye movement detected by camera-based trackers has demonstrated measurable differences between contrasting cognitive states, e.g. during the completion of complex versus simple tasks. By studying individual physiological, interactive and other behavioural features, as well as their fusion/integration, new holistic insights can be gained into the physical manifestations of various cognitive and emotional states, including affect, stress and workload factors.
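
For concreteness, feature-level fusion of the kind described above might look like the following minimal sketch, which assumes scikit-learn and uses synthetic placeholder features for GSR, speech and eye movement (the feature names, dimensions and classifier choice are illustrative assumptions, not a prescribed pipeline):

     import numpy as np
     from sklearn.model_selection import cross_val_score
     from sklearn.pipeline import make_pipeline
     from sklearn.preprocessing import StandardScaler
     from sklearn.svm import SVC

     rng = np.random.default_rng(0)
     n_trials = 200

     # Labels: 0 = simple task, 1 = complex task.
     y = rng.integers(0, 2, size=n_trials)

     # Placeholder per-trial features; in practice these would be extracted
     # from recorded signals (e.g. GSR peak counts, speech pitch statistics,
     # fixation durations from an eye tracker).
     gsr_feats = rng.normal(size=(n_trials, 4))
     speech_feats = rng.normal(size=(n_trials, 10))
     eye_feats = rng.normal(size=(n_trials, 6))
     gsr_feats[y == 1, 0] += 1.0  # inject a weak synthetic task effect

     # Feature-level fusion: concatenate the per-modality feature vectors.
     X = np.hstack([gsr_feats, speech_feats, eye_feats])

     clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
     scores = cross_val_score(clf, X, y, cv=5)
     print("Cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))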

Scope and Expected Impact

The use of multiple sensors in human-computer interaction (HCI) systems allows a wide variety of datasets to be collected. This, together with an array of machine learning techniques, can shed light on implicit behavioural measures and on the kinds of cognitive and emotional states that give rise to these signal patterns. The volume of data now being collected makes this challenge a timely one: there are currently no standards for collecting and labelling multimodal data (interactive or otherwise), and no standards for inducing, and therefore interpreting, the underlying states.

This full-day workshop will provide a unique opportunity for the community to define and shape the future research agenda in unimodal and multimodal cognitive and emotional inference, acting as a focal point and stimulus for significant new research and collaboration. Systems built on these research insights can be expected to have a dramatic effect on future interfaces, opening up a new world of interaction and joint human-computer collaboration optimised to exploit the unique capabilities of each partner. Other related avenues of investigation include the relative merits of different sensors, and correlations between features extracted from different sensor signals.
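
As one illustration, a correlation between features from two different sensors could be examined as in the minimal sketch below, assuming scipy and hypothetical per-trial GSR and pupil-diameter features (the data and the coupling between the two signals are synthetic, purely for illustration):

     import numpy as np
     from scipy.stats import pearsonr

     rng = np.random.default_rng(1)
     n_trials = 200

     # Hypothetical per-trial features from two sensors; the relationship
     # between them here is synthetic, standing in for real recordings.
     gsr_mean = rng.normal(size=n_trials)
     pupil_diam = 0.6 * gsr_mean + rng.normal(scale=0.8, size=n_trials)

     r, p = pearsonr(gsr_mean, pupil_diam)
     print(f"Pearson r = {r:.2f}, p = {p:.3g}")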

Guest Speakers

     * Prof. Sharon Oviatt (Incaa Designs, USA)
     * Prof. Peter Gerjets (KMRC, Germany)

Topics of Interest

     * Automatic recognition of cognitive load, stress and emotions via unimodal or multimodal signals
     * New methods and frameworks for the fusion and integration of multimodal signals
     * Methodology for eliciting cognitive and emotional states and design of user experiments
     * Dimensional approach to the estimation of cognitive and emotional states
     * Novel applications of interfaces utilising cognitive and emotional state inference capabilities
     * New sensors and modalities shown to support cognitive or emotional state inference

Paper Submission

The workshop solicits original and unpublished papers addressing a wide range of issues concerning, but not limited to, the topics listed above. Submissions must be sent as PDF to the following email address: [log in to unmask]. Papers should not exceed 4 pages in total and must use the ACM format (http://www.acm.org/sigs/pubs/proceed/template.html).

Accepted papers will be presented at the workshop in either oral or poster format and will appear in the ICMI conference proceedings. Note that accepted papers will not automatically be published in the ACM Digital Library; instead, selected authors will be invited to submit an extended version of their paper to a journal special issue. At least one author of each paper must register for the ICMI conference and attend the workshop to present the paper. Please refer to the ICMI 2011 website for more information about registration.

Important Dates

     Paper submission:      September 12, 2011
     Author notification:   September 28, 2011
     Camera-ready due:      October 10, 2011
     Workshop:              November 17, 2011

Organisers

     * Fang Chen (NICTA, Australia)
     * Julien Epps (University of New South Wales, Australia)
     * Natalie Ruiz (NICTA, Australia)
     * Eric Choi (NICTA, Australia)

Program Committee

     * Eliathamby Ambikairajah (University of New South Wales, Australia)
     * Bert Arnrich (ETH Zurich, Switzerland)
     * Carlos Busso (The University of Texas at Dallas, USA)
     * Rafael Calvo (University of Sydney, Australia)
     * Paul Corballis (University of Auckland, New Zealand)
     * Henry Gardner (Australian National University, Australia)
     * Roland Goecke (University of Canberra, Australia)
     * Shamsi Iqbal (Microsoft Research, USA)
     * Jonghwa Kim (University of Augsburg, Germany)
     * Sungbok Lee (University of Southern California, USA)
     * Nigel Lovell (University of New South Wales, Australia)
     * Saturnino Luz (Trinity College Dublin, Ireland)
     * Bjoern Schuller (TUM, Germany)
     * Andrew Sears (UMBC, USA)
     * Jianhua Tao (Chinese Academy of Sciences (Beijing), P. R. China)

Contact

Please send any queries or comments to [log in to unmask].

