CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Sender: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Date: Mon, 25 May 2020 11:23:55 +1000
Reply-To: Leimin Tian <[log in to unmask]>
From: Leimin Tian <[log in to unmask]>
***************************************
ICMI 2020: Final Call for Long and Short Papers
http://icmi.acm.org/2020/index.php?id=cfp
25-29 Oct 2020, Utrecht, The Netherlands
***************************************
Extension for updating papers
Dear all,

We have decided to allow replacing the paper PDF for one more week,
that is, until *June 5 (23:59 GMT-7)*. The original submission deadline
(May 29) stays the same, and the title, abstract, and author list
*cannot* be changed after that deadline, since we will start the review
bidding process immediately.

Best,
the ICMI 2020 organizing team
***************************************

Call for Long and Short Papers

The 22nd International Conference on Multimodal Interaction (ICMI 2020)
will be held in Utrecht, the Netherlands. ICMI is the premier international
forum for multidisciplinary research on multimodal human-human and
human-computer interaction, interfaces, and system development. The
conference focuses on theoretical and empirical foundations, component
technologies, and combined multimodal processing techniques that define the
field of multimodal interaction analysis, interface design, and system
development.

We are keen to showcase novel input and output modalities and interactions
to the ICMI community. ICMI 2020 will feature a single-track main
conference that includes keynote speakers, technical full and short
papers (with oral and poster presentations), demonstrations, exhibits,
and doctoral spotlight papers. The conference will also feature
workshops and grand challenges. The proceedings of ICMI 2020 will be
published by ACM in its International Conference Proceedings Series and
in the ACM Digital Library.

We also welcome conference papers from the behavioral and social
sciences. These papers help us understand how technology can be used to
increase our scientific knowledge, and they may focus less on
presenting technical or algorithmic novelty. For this reason, the
"novelty" criterion used during the ICMI 2020 review will be based on
two sub-criteria (scientific novelty and technical novelty, as
described below). Papers accepted at ICMI 2020 need to be novel on only
one of these sub-criteria. In other words, a paper that is strong on
scientific knowledge contribution but low on algorithmic novelty should
be ranked similarly to a paper that is high on algorithmic novelty but
low on knowledge discovery.

- Scientific Novelty: Papers should bring new knowledge to the
scientific community, for example by discovering new behavioral markers
that are predictive of mental health, or by showing how new behavioral
patterns relate to children’s interactions during learning. It is the
responsibility of the authors to perform a proper literature review and
clearly discuss the novelty of the scientific discoveries made in their
paper.
- Technical Novelty: Papers reviewed under this sub-criterion should
include novelty in their computational approach for recognizing,
generating, or modeling data. Examples include novelty in the learning
and prediction algorithms, in the neural architecture, or in the data
representation. Novelty can also be associated with a new usage of an
existing approach.

Please see the Submission Guidelines for Authors
(https://icmi.acm.org/2020/index.php?id=authors) for detailed
submission instructions.

This year’s conference theme: In this information age, technological
innovation is at the core of our lives and is rapidly transforming the
state of the world in art, culture, society, and science; the borders
between classical disciplines such as the humanities and computer
science are fading. In particular, we ask how multimodal processing of
human behavioural data can create meaningful impact on practices in
art, culture, and society, and, vice versa, how art, culture, and
society influence our approaches and techniques in multimodal
processing. As such, this year ICMI welcomes contributions on its
theme: Multimodal Processing and Representation of Human Behaviour in
Art, Culture, and Society.

Additional topics of interest include but are not limited to:

- Affective computing and interaction
- Cognitive modeling and multimodal interaction
- Gesture, touch and haptics
- Healthcare, assistive technologies
- Human communication dynamics
- Human-robot/agent multimodal interaction
- Interaction with smart environments
- Machine learning for multimodal interaction
- Mobile multimodal systems
- Multimodal behavior generation
- Multimodal datasets and validation
- Multimodal dialogue modeling
- Multimodal fusion and representation
- Multimodal interactive applications
- Speech behaviors in social interaction
- System components and multimodal platforms
- Visual behaviors in social interaction
- Virtual/augmented reality and multimodal interaction

Important Dates:

*Paper Submission: May 29, 2020 (EXTENDED)*
Reviews to authors: July 15, 2020
Rebuttal due: July 20, 2020
Paper notification: *August 7, 2020*
Camera ready paper: *September 2, 2020*
Presenting at main conference: October 25-29, 2020

