CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Sender: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Date: Tue, 12 Feb 2019 04:45:49 +0000
From: zakia hammal <[log in to unmask]>
Reply-To: zakia hammal <[log in to unmask]>
 Apologies for cross-posting

***********************************************************************************
ICMI 2019: Call for Long and Short Papers
https://icmi.acm.org/2019/index.php?id=cfp

Abstract Submission: May 1, 2019 (11:59pm PST)
Final Submission:    May 7, 2019 (11:59pm PST)
***********************************************************************************

Call for Long and Short Papers

 The 21st International Conference on Multimodal Interaction (ICMI 2019) will be held in Suzhou, China. ICMI is the premier international forum for multidisciplinary research on multimodal human-human and human-computer interaction, interfaces, and system development. The conference focuses on theoretical and empirical foundations, component technologies, and combined multimodal processing techniques that define the field of multimodal interaction analysis, interface design, and system development.

We are keen to showcase novel input and output modalities and interactions to the ICMI community. ICMI 2019 will feature a single-track main conference which includes: keynote speakers, technical full and short papers (including oral and poster presentations), demonstrations, exhibits and doctoral spotlight papers. The conference will also feature workshops and grand challenges. The proceedings of ICMI 2019 will be published by ACM as part of their series of International Conference Proceedings and Digital Library.

We also welcome conference papers from the behavioral and social sciences. These papers help us understand how technology can be used to increase our scientific knowledge, and may focus less on presenting technical or algorithmic novelty. For this reason, the "novelty" criterion used during ICMI 2019 review will be based on two sub-criteria (i.e., scientific novelty and technical novelty, as described below). Accepted papers at ICMI 2019 need only be novel on one of these sub-criteria. In other words, a paper that is strong on scientific knowledge contribution but low on algorithmic novelty should be ranked similarly to a paper that is high on algorithmic novelty but low on knowledge discovery.

    Scientific Novelty: Papers should bring new knowledge to the scientific community. Examples include discovering new behavioral markers that are predictive of mental health, or showing how new behavioral patterns relate to children's interactions during learning. It is the responsibility of the authors to perform a proper literature review and clearly discuss the novelty of the scientific discoveries made in their paper.
    Technical Novelty: Papers reviewed under this sub-criterion should include novelty in their computational approach for recognizing, generating, or modeling data. Examples include novelty in the learning and prediction algorithms, in the neural architecture, or in the data representation. Novelty can also be associated with a new usage of an existing approach.

Please see the Submission Guidelines for Authors for detailed submission instructions.

This year, ICMI welcomes contributions on our theme of multimodal understanding of multi-party interactions. Additional topics of interest include, but are not limited to:

    Affective computing and interaction
    Cognitive modeling and multimodal interaction
    Gesture, touch and haptics
    Healthcare, assistive technologies
    Human communication dynamics
    Human-robot/agent multimodal interaction
    Interaction with smart environment
    Machine learning for multimodal interaction
    Mobile multimodal systems
    Multimodal behavior generation
    Multimodal datasets and validation
    Multimodal dialogue modeling
    Multimodal fusion and representation
    Multimodal interactive applications
    Speech behaviors in social interaction
    System components and multimodal platforms
    Visual behaviors in social interaction
    Virtual/augmented reality and multimodal interaction

Important Dates:

Abstract Submission 	May 1, 2019
Final Submission 	May 7, 2019
Paper Rebuttal Due 	June 25, 2019
Author Notification 	July 7, 2019
Paper Camera Ready 	July 15, 2019

Best regards,
Social Media Chair ICMI 2019

Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University

http://www.ri.cmu.edu/
http://ri.cmu.edu/personal-pages/ZakiaHammal/

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
