CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

From: Ronald Böck <[log in to unmask]>
Date: Thu, 6 May 2021 07:41:09 +0200
**** We apologise for cross-postings ****

**** Please forward this e-mail to potentially interested
students/researchers ****

5th International Workshop on Multimodal Analyses enabling Artificial
Agents in Human-Machine Interaction
MA3HMI 2021

http://ma3hmi.cogsy.de

Scope
One of the aims of building multimodal user interfaces is to make the
interaction between user and system as natural as possible, in settings
that are themselves as natural as possible. The most natural form of
interaction is arguably the way we interact with other humans. While the
analysis of human-human communication has yielded many insights,
transferring them to human-machine interaction remains challenging.
Automated analysis of the interaction has to consider both semantic and
affective aspects, including the user's personality, mood, and
intentions, and to anticipate the counterpart's behaviour. These
processes have to run in real time, in a natural environment, so that
the system can respond without delay.
The MA3HMI workshop series, now in its 5th edition, brings together
researchers working on the analysis of multimodal data as a means to
develop technical devices that can interact with humans and react to
human affect. Artificial agents are understood in their broadest sense,
including virtual chat agents, empathic speech interfaces and lifestyle
coaches on a smartphone. In line with the main conference's theme, we
focus on ethical aspects, including those arising in data collection,
biases in model development, and the deployment of systems.

Topics
(a) Multimodal environment analyses
• Multimodal understanding of the situation and environment of natural
interactions as context for affect
• Affect annotation paradigms for user analyses in natural interactions
• Novel strategies for human-machine interaction in terms of situation
and environment
(b) Multimodal user analyses
• Multimodal understanding of user behaviour and affective state in
natural interactions
• Dialogue management using multimodal output
• Multimodal understanding of multiple users' behaviour and affect
(c) Applications, tools and systems
• Ethical issues in developing real-time multimodal affective user
interfaces
• Novel application domains and embodied interaction
• User studies with (partially) functional systems

Important dates:
Submission deadline: June 30, 2021
Notification of acceptance: July 30, 2021
Camera-ready: August 20, 2021
Workshop date: September 28 or October 1, 2021

Submissions
Prospective authors are invited to submit full papers (8 pages: 7 pages
of content plus 1 page of references) and short papers (5 pages: 4 pages
of content plus 1 page of references) using the ACII 2021 LaTeX or Word
templates, as specified by ACII 2021. All submissions should be
anonymous. Accepted papers will be published in the conference
proceedings.

Venue
Virtual – in conjunction with ACII 2021

Organizers
Ronald Böck, University Magdeburg, Germany
Ronald Poppe, Utrecht University, The Netherlands
Francesca Bonin, IBM Research Europe


-- 
PD Dr.-Ing. habil. Ronald Böck
FEIT IIKT-Cognitive Systems
Building 03, Room 322
Otto von Guericke University Magdeburg
Universitaetsplatz 2, 39106 Magdeburg, Germany
Phone: +49 391 67 50061
E-mail:
[log in to unmask]
[log in to unmask]
Web: http://www.kognitivesysteme.de





