ACM SIGCHI General Interest Announcements (Mailing List)


"Ghosh, Ayan" <[log in to unmask]>
Mon, 15 May 2017 22:12:03 +0000

The international full-day workshop:

"Interaction with agents and robots: Different Embodiments, Common Challenges"

In conjunction with the 17th International Conference on Intelligent Virtual Agents (IVA 2017) - Stockholm - August 27, 2017


Aim and Scope

As hardware and enabling technologies such as speech recognition improve, virtual agents and robots are increasingly able to autonomously take part in “face-to-face” interactions with people. But engaging in these types of interactions successfully requires the complex coordination of verbal and nonverbal behaviour, as well as the ability to respond to the verbal and/or nonverbal cues of a human interaction partner.

Researchers in the fields of virtual agents and human-robot interaction (HRI) have been concerned with these problems, and both fields provide many examples of implemented software systems that can offer useful insights to researchers of the two communities. However, while the HRI and agents communities share research interests, there are differences in methodology and focus. This workshop seeks to increase communication and knowledge sharing between these communities. What can the agents community learn from HRI about the evaluation of people's attitudes towards and acceptance of embodied agents? And what can the HRI community learn from the agents community about designing control architectures for embodied conversation?

Research on embodiment in agents suggests that whether an agent is embodied virtually or in hardware can influence how people respond during interaction. But in many real-world applications, such as assistance in domestic environments, it may be most practical for people to interact with both robots and software agents under different conditions. How to design and realise these systems of hybrid embodiment, as well as how to study people's impressions of them, are open questions for research. Should differing embodiments have distinct agency and personalities? Or should a single agent migrate across them while interacting with a user?

The goal of this workshop is to enable a cross-fertilisation of ideas and solutions for the issues encountered when attempting to generate behaviour coordination between machines and humans, in ways that can be used independently of the embodiment of the agent. Similarly, we aim to better understand the limitations and benefits of physical or virtual embodiment for different interaction scenarios. This approach will enable a discussion about the nature of the differences between software and robotic agents, not only with respect to their construction and scientific use, but also to their fields of application.

Keynote Speakers

Dr. Catherine Pelachaud – CNRS, Institute for Intelligent Systems and Robotics, Paris

Dr. Ruth Aylett – Heriot-Watt University, Edinburgh


Papers should be submitted in PDF format to [log in to unmask]. Accepted submissions will be made available on the workshop website.

Papers should be formatted according to the requirements for Springer's Lecture Notes in Computer Science (LNCS).

Submitted papers should be between 2 and 6 pages in length.

Topics include, but are not limited to:

• synchronising verbal and nonverbal behaviour

• AI for autonomous interaction and dialogue management for embodied agents and robots

• design of hybrid and/or "migrating" embodiments

• producing expressive gestures

• evaluating embodied interaction

• fostering long-term acceptance of and attachment to embodied agents

• expression and perception of socio-emotional states

• multi-party human-robot-agent interaction

• blending realities: robots and agents understanding and reacting to virtual and real events

Important Dates

Paper submission: 1-July-2017

Notification of acceptance: 15-July-2017

Camera-ready version: 1-August-2017

Workshop: 27-August-2017


Organisers

Mathieu Chollet - USC Institute for Creative Technologies, USA

Ayan Ghosh - Heriot-Watt University, UK

Hagen Lehmann - Italian Institute of Technology, Italy

Yukiko Nakano - Seikei University, Japan



    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see