ACM SIGCHI General Interest Announcements (Mailing List)


Date: Tue, 10 May 2016 08:40:46 +0200
From: Ronald Böck <[log in to unmask]>
To: ACM SIGCHI General Interest Announcements (Mailing List) <[log in to unmask]>
**** We apologise for cross-postings ****
**** Please forward this e-mail to potentially interested students/researchers ****

3rd International Workshop on
Multimodal Analyses enabling Artificial Agents in Human-Machine Interaction
(MA3HMI 2016)

November 16th, 2016 in Tokyo, Japan.
In conjunction with ICMI2016.

One of the aims in building multimodal user interfaces and combining 
them with technical devices is to make the interaction between user and 
system as natural as possible. The most natural form of interaction is 
arguably the way we interact with other humans. Current technology, 
however, is still far from human-like, and existing systems reflect a 
wide range of technical solutions. Transferring insights from the 
analysis of human-human communication to human-machine interaction 
remains challenging: the multimodal inputs from the user (e.g., speech, 
gaze, facial expressions) must be recorded and interpreted, and this 
interpretation has to occur at both the semantic and the affective 
level, covering aspects such as the personality, mood, or intentions of 
the user. All of these processes have to run in real time so that the 
system can respond without delay, keeping the interaction smooth.
The MA3HMI workshop aims to bring together researchers working on the 
analysis of multimodal data as a means to develop technical devices 
that can interact with humans. Artificial agents are understood in 
their broadest sense, including virtual chat agents, empathic speech 
interfaces, and life-style coaches on a smartphone. More generally, 
multimodal analyses support any technical system in the research area 
of human-machine interaction. We focus on the real-time aspects of 
human-machine interaction and address the development and evaluation of 
multimodal, real-time systems.
We solicit papers that concern the different phases of the development 
of such interfaces. Tools and systems that address real-time 
conversations with artificial agents and technical systems are also 
within the scope of the workshop.

Topics include (but are not limited to):
a) Multimodal Annotation
- Representation formats for merged multimodal annotations
- Best practices for multimodal annotation procedures
- Innovative multimodal annotation schemes
- Annotation and processing of multimodal data sets
- Real-time or on-the-fly annotation approaches
b) Multimodal Analyses
- Multimodal understanding of user behavior and affective state
- Dialogue management using multimodal output
- Evaluation and benchmarking of human-machine conversations
- Novel strategies of human-machine interactions
- Using multimodal data sets for human-machine interaction
c) Applications, Tools, and Systems
- Novel application domains and embodied interaction
- Prototype development and uptake of technology
- User studies with (partial) functional systems
- Tools for the recording, annotation and analysis of conversations

Important Dates:
Submission Deadline: August 28th, 2016
Notification of Acceptance: October 2nd, 2016
Camera-ready Deadline: October 9th, 2016 (fixed date)
Workshop Date: November 16th, 2016

Prospective authors are invited to submit full papers (8 pages) or 
short papers (5 pages) in the ACM format specified by ICMI 2016. 
Accepted papers will be published as post-proceedings in the ACM 
Digital Library. All submissions should be anonymous.

Organisers:
Ronald Böck, University Magdeburg, Germany
Francesca Bonin, IBM Research, Ireland
Nick Campbell, Trinity College Dublin, Ireland
Ronald Poppe, Utrecht University, Netherlands

Dr.-Ing. Dipl.-Inf. Ronald Böck
Cognitive Systems Group
Building 03, Room 322
Otto von Guericke University Magdeburg
Universitaetsplatz 2, 39106 Magdeburg, Germany
Phone: +49 391 - 67 50061
[log in to unmask]
[log in to unmask]

    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see