ACM SIGCHI General Interest Announcements (Mailing List)


Silvia Rocchi <[log in to unmask]>
Reply To:
Silvia Rocchi <[log in to unmask]>
Fri, 15 Jul 2005 10:56:16 +0200
text/plain (129 lines)
Posted on behalf of Alessandro Vinciarelli and Jean-Marc Odobez

(Apologies for multiple postings)

The submission deadline for the International Workshop on Multimodal Multiparty Meeting Processing has been extended.
The deadline is now **** July 28th **** (no further extensions will be possible).

Please see details below.






International Workshop on Multimodal Multiparty Meeting Processing

October 7, 2005
Trento, Italy

in conjunction with ICMI'05,
Seventh International Conference on Multimodal Interfaces, 4-6 October 2005



Interest in the automatic collection and analysis of meeting recordings is constantly increasing in the research community. Current efforts focus not only on the single modality of speech, but take a broader view, attempting to derive useful information from meetings based on multimodal perception and understanding of a wide array of information sources (gesture, handwriting, sketches and other manual activity, body and head pose, eye gaze, email leading up to a given meeting, documents that are part of the subject matter or background for a meeting, agendas, lists of critical outcomes, etc.). Such a wide spectrum of input sources provides the opportunity to explore truly multimodal processing approaches, which remain a hard and open challenge in many respects. Technology has yet to be proven effective as a means of handling meetings, from both offline and online perspectives.

Offline processing technologies aim at making meeting recording archives a valuable asset, by extracting the content of meeting archives and capitalizing on the knowledge they contain. Important research efforts are directed towards increasingly complex content analysis algorithms producing useful indexing material, ranging from fact detection and extraction to analysis tasks involving higher-level interpretation, such as participant interaction analysis (is there agreement?) or evaluation of the meeting's development (has any decision been made? what is the agenda? did the meeting reach its initial goals?). At the same time, sophisticated interfaces must be designed that move beyond simple content reproduction and allow users to access and use such complex data effectively.

Online processing aims essentially at developing systems that support the activities of co-located meeting participants and/or involve remote participants. In such a situation, computers become the channel of human-human interaction and represent a bottleneck, resulting in unnatural interaction as well as reduced communication effectiveness. For this reason, many researchers have studied ways to use computational support to create collaborative environments that are at least as good as, if not better than, "being there". The use of multimodal interfaces can address the problem by conveying more information and/or directing the users' attention towards the truly important elements.

The goal of this workshop is to gather researchers from academia and industry, active in the above or related domains, in order to acquire a broad view of the current state of the art, share experiences, exchange ideas, and establish collaborations and contacts. The workshop will be the place to discuss the opportunities and effective usefulness of newly developed technologies for meeting applications. Thus, we are looking for position papers as well as research papers debating or contributing to the following (and other related) areas:
-  Smart meeting rooms
-  Meeting data collection and annotation tools
-  Multichannel processing
-  Multimodal identification of intent and emotion
-  Multimodal person identification
-  Meeting dynamics and human-human interaction modeling
-  Multimodal dialogue modeling
-  Remote collaboration in meetings
-  Content abstraction, summarization and structuring
-  Multimodal indexing and retrieval


We invite the submission of both POSITION and RESEARCH papers.

In both cases, submissions should take the form of an article (maximum
8 pages), along with a paper summary (maximum one page) in the same
format. All submissions will be pre-reviewed and selected based on
contribution to the workshop topics, originality, and the shared
interests of participants.

Please send your submissions (and indicate your paper type, research
or position) as two (paper + summary) PDF or PS files to
[log in to unmask] before July 28, 2005. For the format, we strongly
recommend using the ACM SIG Proceedings Templates.


Paper submission deadline: July 28, 2005.
Acceptance notification: August 4, 2005.
Camera-ready papers due: August 24, 2005.
Workshop: October 7, 2005.


All papers accepted for the workshop will be published in the
ICMI 2005 Workshop Proceedings, and paper summaries will be published
on the workshop website. Detailed information about the ACM SIG
Proceedings Templates can be found on the web site:


Hervé Bourlard, IDIAP Research Institute (Switzerland)
Trevor Darrell, Massachusetts Institute of Technology (USA)
Irfan Essa, Georgia Institute of Technology (USA)
Hynek Hermansky, IDIAP Research Institute (Switzerland)
Alex Jaimes, Fuji-Xerox Research (Japan)
Iain McCowan, E-Health Research Centre (Australia)
Stanley Peters, Center for the Study of Language and Information,
Stanford University (USA)
Christine Perey, Perey Research and Consulting (USA/Switzerland)


Alessandro Vinciarelli
Jean-Marc Odobez

IDIAP Research Institute,
Rue du Simplon 4 PO 592,
CH 1920 Martigny, Switzerland
Phone:  +41 (0)27 721 77 24 or 26 - Fax : +41 (0)27 721 77 13.
email: [log in to unmask], [log in to unmask]

Phil Cohen

OHSU and Natural Interaction Systems, LLC (USA)
email: [log in to unmask]
