ACM SIGCHI General Interest Announcements (Mailing List)


Kostas Karpouzis <[log in to unmask]>
Wed, 7 Jan 2009 11:54:08 +0200
1st Call for Papers

Real-Time Affect Analysis and Interpretation:
Closing the Affective Loop in Virtual Agents and Robots

Special issue of the Journal on Multimodal User Interfaces

Guest editors:

Ginevra Castellano, Queen Mary University of London, Department of 
Computer Science, School of Electronic Engineering and Computer Science, 
United Kingdom; [log in to unmask]

Kostas Karpouzis, Institute of Communication and Computer Systems,
National Technical University of Athens, Greece; [log in to unmask]

Christopher Peters, Department of the Digital Environment,
Coventry University, United Kingdom; [log in to unmask]

Jean-Claude Martin, LIMSI-CNRS, Orsay, France; [log in to unmask]

Deadline for Paper Submission: ***6th April 2009***

This special issue will address computational models and techniques for the 
real-time interpretation of the user's behavior to produce mid- or 
high-level state descriptors, ranging from basic emotions to more complex 
appraisals or mental states (e.g. agreement, interest, or blends of 
several emotions), for the purpose of closing the affective loop in 
social robots and virtual agents.

A vital requirement for social robots and virtual agents is the ability 
to infer the affective and mental states of humans, so as to be able to 
engage in and behave appropriately during sustained social interactions. 
Examples include ensuring that the user is interested in maintaining the 
interaction or providing suitable empathic responses. A fundamental 
component in these 'mentalizing' and 'empathizing' capabilities is the 
interpretation of human behavior from sensory input, which must be 
conducted in a timely manner. Researchers in multimodal interfaces have 
increasingly addressed the design of systems endowed with these 
abilities. Nevertheless, only a few attempts have been made to develop 
virtual agents and robots capable of inferring the user's states in 
real time.

The focus of this special issue is on real-time computational techniques 
for the recognition and interpretation of human verbal and non-verbal 
behavior, models of 'mentalizing' and 'empathizing' for the integrative 
representation and processing of input data, and their implementation in 
human-agent and human-robot interaction frameworks.

***Important Dates***

- Deadline for paper submission: 6th April 2009
- Notification of acceptance: 11th May 2009
- Camera-ready version of accepted papers: 8th June 2009
- Publication date: July/August 2009

Topics to be addressed include, but are not limited to:

- Multimodal affect recognition (facial expressions, body language, speech,
   biosignals, typed text, users' actions, etc.)

- Perception-action loops in agents/robots

- Affect sensitive and socially interactive agents/robots

- Cognitive and affective mentalizing / theory of mind

- Social appraisal

- Visual attention

- Theories of emotion

- Emotion and cognitive state representation

- Context awareness

- Cognitive modeling of the user

- Individual differences in the expression and perception of affect

- User engagement

- Evaluation of affective interaction and user-centred design

- Applications: interactive games, empathic interfaces, pedagogical 
agents, health care, etc.

***Instructions for Authors***

Submissions should be 4 to 12 pages long and must be written in English.

Formatting instructions and templates are available at:

Authors should register and upload their submission at the following 
address:

During the submission process, please select 'AFFINE special issue' as 
the article type.

Authors are encouraged to send a brief email to [log in to unmask] as 
soon as possible indicating their intention to participate, including 
their contact information and the topic they intend to address in their 
submission.