ACM SIGCHI General Interest Announcements (Mailing List)


Thu, 22 Sep 2005 13:40:15 +1000
Rod Farmer <[log in to unmask]>

The following is the First Call for Papers for an Australian workshop on
"Effective Multimodal Dialogue Interfaces".

This workshop is co-located with the 2006 International Conference on
Intelligent User Interfaces (IUI 2006), Sydney, Australia.

Please refer to the details below.

Rod Farmer
[log in to unmask]

	              First Call for Papers

                           Workshop on
            Effective Multimodal Dialogue Interfaces

               to be held in conjunction with the 
    2006 International Conference on Intelligent User Interfaces
                        Sydney, Australia
                         January 29 2006 

Multimodal and speech-based interfaces are becoming more pervasive.
These interfaces have found applications in the control of intelligent
devices, information-delivery stations, and tutoring and training
systems. Further, multimodal interfaces have been implemented across a
range of devices and media, from speech-enabled touch-screens, to
PDAs, cars and mobile devices, and immersive virtual environments. As
well as speech and simple mouse-clicks, modalities may include
drawing- and writing-recognition, gesture-recognition, and haptics.

It is often claimed that such interfaces support more naturalistic and
efficient styles of interaction. However, such claims are often
accepted at face value; most metrics for evaluation focus on error and
comparison to human performance of similar interactive tasks (e.g. see
metrics for evaluating dialogue systems). Much work has recently been
carried out in linguistics, psychology, and sociology identifying and
investigating phenomena in human-human interaction that enable and
enhance successful communication and collaborative task performance.
However, it is not typically questioned whether implementing such
phenomena carries the same value for human-machine interaction.

This workshop will address the issue of evaluating multimodal dialogue
systems, and in particular the characteristics and interaction styles
that are particularly effective for human-machine collaborative task
performance. These may include features that are known to be effective
and important in human-human interaction. Conversely, it may be the
case that certain effective interaction design decisions (e.g. for
overcoming speech-recognition error) are less "natural".

The workshop encourages participation by dialogue system and HCI
researchers and interaction designers, as well as linguists,
psychologists, and sociologists interested in human-human interaction
and in the evaluation of effective human-machine interaction.

The objectives of the workshop are to address a number of questions: 

- What makes a multimodal dialogue interface effective, and how is
  this meaningfully measured for different task domains? 
- What is the state of recent research in human-human interaction, and
  how can such findings inform the design of effective human-machine
  interaction?
- What interaction strategies and techniques can be used to make
  human-machine multimodal dialogue more robust? 

The targeted outcomes of the workshop include a better understanding
of how to design and build multimodal dialogue interfaces that support
successful collaborative task performance. Another important targeted
outcome is a method and set of metrics for evaluating such interfaces
and their effectiveness.

Possible topics include (but are not limited to): 

- Strategies for efficient interaction, engagement and responsiveness,
  and their role in effectiveness; 
- Methodologies and evaluation metrics for effective interaction; 
- Results from human-human interaction regarding interaction for
  effective collaboration, and their impact on system design; 
- Evaluation of effective human-system collaboration via interaction; 
- Strategies for evaluating effective task performance in multimodal
  dialogue systems; 
- Architectures and design for human-centered interactive systems; 
- Strategies for effective information generation and presentation.

The workshop will follow an interactive format, involving longer
presentations, shorter position papers and responses, and
mini-panels. We also plan for an invited presentation and a discussion
panel.  Note: all workshop participants will be required to register
both for the host conference (IUI 2006) and the workshop itself.

Submission Instructions
Submissions may be either "regular" papers of up to 8 pages or
"position" papers of up to 3 pages. Either
type of submission should use the same formatting instructions as the
main IUI conference (i.e. ACM style); you can find links to formatting
templates at (do not use the
automatic submission system at that address).

Mail submissions to [log in to unmask]. Submissions should include: 
- A separate plain text cover page with title, authors and
  affiliation, abstract, a list of keywords, and an indication as to
  whether the paper is a regular or position paper; 
- A pdf file of the paper (regular or position). 

Contingent on quality of submissions, workshop proceedings will be
published in either hardcopy or electronic format.

Important Dates
- November 14 2005: Due date for submissions 
- December 5 2005: Notification of acceptance 
- December 19 2005: Final versions of accepted papers due 
- January 29 2006: Workshop date 

Organizing Committee
Lawrence Cavedon, National ICT Australia and RMIT University, Australia 
Fang Chen, National ICT Australia 
Robert Dale, Macquarie University, Australia 
David Traum, Institute for Creative Technologies (ICT), University of
             Southern California, USA 

Program Committee

For further information contact Lawrence Cavedon ([log in to unmask]).
