ACM SIGCHI General Interest Announcements (Mailing List)


Kostas Karpouzis <[log in to unmask]>
Fri, 29 Feb 2008 17:28:22 +0200
Call for Papers

Workshop on Affect-aware Human-Computer and Human-Robot Interaction

Held during the 1st International Conference on PErvasive Technologies
Related to Assistive Environments (PETRA'08)
Athens, Greece, July 15-19, 2008
Conference website:

Recent research on multimodal interfaces has focused on developing
natural, adaptive and intelligent interfaces that enable machines and
robots to communicate with humans in ways close to human-to-human
interaction. The dominant approaches share two main characteristics:
natural interfaces catering for visual, speech and gestural input, and
human-like interface agents and robots. In terms of theoretical
grounding, such interfaces are beginning to take into account emotion
representation theories and definitions of virtual personalities.
As a result, systems are being developed that take user expressivity
into account and offer robust, natural multimodal interaction. Beyond
computer systems, affect-aware robotic platforms are now entering even
the consumer market. Not long ago, the social impact of such products
on everyday life was largely unknown; with the advent of low-cost
robots able to receive, transmit and process visual, motion and sound
information, however, interface agents and assistants have taken an
embodied form as well.
This workshop builds on themes from successfully completed and ongoing
European projects in the fields of interfaces and robotics, such as
the NoE Humaine, the IP Callas, and the STREP Feelix-Growing.

Topics include, but are not limited to:
• Affect-aware interfaces
• Ambient/natural interfaces
• Adaptive/cooperative interfaces
• Interfaces for attentive and intelligent environments
• Signal acquisition and processing for handicapped people
• Information/sensor fusion techniques and architectures
• Machine Learning for human-computer and human-robot interaction
• Multimodal recognition of affect
• Affect recognition ‘in the wild’
• Theory of Emotion/Emotion Representation
• Cognitive modeling of users
• Virtual/augmented environments for handicapped people
• Affect-aware e-learning applications
• Applications of Affective Robots
• Emerging affect-aware standards

Elisabeth Andre, Univ. of Augsburg, Germany
Kostas Karpouzis, ICCS/NTUA, Greece
Maja Pantic, Imperial College, UK
Catherine Pelachaud, Universite de Paris 8, France

Lola Canamero, University of Hertfordshire, UK
Philippe Gaussier, Cergy-Pontoise University/ENSEA, France
Stefanos Kollias, National Technical Univ. of Athens, Greece
Ilias Maglogiannis, Univ. of Aegean, Greece
Paolo Petta, OFAI, Austria
John Soldatos, Athens Information Technology, Greece

Paper submission: March 30, 2008
Notification of acceptance: April 30, 2008
Camera ready papers due: May 20, 2008

Prospective authors should submit their contribution by e-mail to
[log in to unmask], following the instructions described at the
conference website. At least one author of each accepted paper is
required to register for the workshop and present the paper. Authors
should use the ACM style templates to prepare their articles.
Submitted papers should be up to 8 pages; papers may be accepted
either as full papers (8 pages) or short papers (4 pages).
Accepted papers will be presented at the conference and published in the 
Proceedings of the Conference. ACM will be the publisher of the 
proceedings of the PETRA conference and the conference proceedings will 
be a volume in the ACM International Conference Proceedings Series in 
the ACM Digital Library.
