ACM SIGCHI General Interest Announcements (Mailing List)


Youngwoo Yoon <[log in to unmask]>
Fri, 9 Apr 2021 10:50:59 +0200

*Call for papers: GENEA (Generation and Evaluation of Nonverbal Behaviour for Embodied Agents) Workshop 2021*

Date: 18–22 October 2021 (exact date to be determined)


Abstract Registration Due

Jun 12, 2021

Submission Deadline

Jun 15, 2021

Notification Due

Jul 12, 2021

Final Version Due

Aug 3, 2021


Generating nonverbal behaviours, such as gesticulation, facial expressions
and gaze, is of great importance for natural interaction with embodied
agents such as virtual agents and social robots. At present, behaviour
generation is typically powered by rule-based systems, data-driven
approaches, and their hybrids. For evaluation, both objective and
subjective methods exist, but their application and validity are frequently
a point of contention.

This workshop asks “What will be the behaviour-generation methods of the
future? And how can we evaluate these methods using meaningful objective
and subjective metrics?” The aim of the workshop is to bring together
researchers working on the generation and evaluation of nonverbal
behaviours for embodied agents to discuss the future of this field. To
kickstart these discussions, we invite all interested researchers to submit
a paper for presentation at the workshop.

GENEA 2021 is the second GENEA workshop and an official workshop of ACM
ICMI’21, which will take place either in Montreal, Canada, or online.
Accepted submissions will be included in the adjunct ACM ICMI proceedings.

*Paper topics include (but are not limited to) the following:*

   - Automated synthesis of facial expressions, gestures, and gaze movements
   - Audio- and music-driven nonverbal behaviour synthesis
   - Closed-loop nonverbal behaviour generation (from perception to action)
   - Nonverbal behaviour synthesis in two-party and group interactions
   - Emotion-driven and stylistic nonverbal behaviour synthesis
   - New datasets related to nonverbal behaviour
   - Believable nonverbal behaviour synthesis using motion-capture and 4D
   scan data
   - Multi-modal nonverbal behaviour synthesis
   - Interactive/autonomous nonverbal behavior generation
   - Subjective and objective evaluation methods for nonverbal behaviour
   - Guidelines for nonverbal behaviours in human-agent interaction

For papers specifically on the topic of healthcare, whether for generating
or understanding nonverbal behaviours, consider submitting to the workshop
on Socially-Informed AI for Healthcare, also taking place at ICMI’21. The
website of that workshop can be found at:

*Submission guidelines*

Please format submissions for double-blind review according to the ACM
conference format.

We will accept long (8 pages) and short (4 pages) paper submissions, along
with posters (3 page papers), all in the double-column ACM conference
format. Pages containing only references do not count toward the page limit
for any of the paper types. Submissions should be made in PDF format
through OpenReview.

*Keynote speakers*

Louis-Philippe Morency (CMU, USA)

Hatice Gunes (University of Cambridge, UK)


*Organisers*

Taras Kucherenko, KTH Royal Institute of Technology, Sweden

Zerrin Yumak, Utrecht University, The Netherlands

Gustav Eje Henter, KTH Royal Institute of Technology, Sweden

Pieter Wolfert, Ghent University, Belgium

Youngwoo Yoon, ETRI / KAIST, South Korea

Patrik Jonell, KTH Royal Institute of Technology, Sweden

The main contact address of the workshop is: *[log in to unmask]*
