ACM SIGCHI General Interest Announcements (Mailing List)


Sender: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Date: Thu, 25 Oct 2007 09:28:33 +0000
Reply-To: Ian Oakley <[log in to unmask]>
From: Ian Oakley <[log in to unmask]>
2nd International Workshop on Haptic and Audio Interaction Design
Call for Participation 
Early registration: until 27th October 2007 (100 USD)
Pre-registration: until 23rd November 2007 (150 USD)
Workshop dates: 29-30 November 2007 in Seoul, Korea

We apologise if you receive multiple copies of this notice.

Technologies to enable multimodal interaction are now sufficiently mature that research is turning away from pure technology development and towards interaction and design issues. Robust solutions exist to display audio and haptic feedback in many forms - for instance, as speech and non-speech sounds, and through tactile and force feedback sensations. Furthermore, it has been demonstrated that the novel interactions supported by these modalities can benefit all users. However, many questions remain: how can we design effective haptic, audio and multimodal interfaces? To what new application areas can we apply these techniques? Which design methods are useful? Which evaluation techniques are particularly appropriate?

While multimodal interfaces are attracting more and more attention, there is relatively little work on how the haptic and auditory modalities can be efficiently and effectively combined in an interface. Is there information that is better communicated using one modality rather than another? How can we link haptic and auditory displays so that changes in one modality are reflected in the other? Can we create complementary relationships between the information displayed to each sense? Additionally, how should we interact with these new displays and interfaces? Is a direct manipulation interaction style still appropriate? A technique that works well with a force feedback device may not be appropriate for all types of displays. How should we interact with a tactile display, or manipulate a sonified graph?

Whilst audio and haptic interaction has been shown to be a useful tool, neither sense has the bandwidth of the visual modality. Careful, considered and informed interaction design will play a vital role if multimodal systems are to move beyond the lab and into the real world. This workshop seeks novel research addressing this human-centric challenge.

The final program features keynotes from James Ballas of the US Naval Research Laboratory and Dong-Soo Kwon of KAIST, a full program of peer-reviewed papers (to be published as a volume of Springer's LNCS) and a lively demo and poster session. Check the webpage for full details.

Important Dates
  - 27th September - 26th October 2007: Early registration (100 USD)
  - 27th October  - 23rd November 2007: Pre-registration (150 USD)
  - 29th-30th November 2007: Haptic and Audio Interaction Design Workshop, onsite registration (200 USD)

The conference banquet (to be held on 29th November 2007) costs an additional 50 USD.

The 2nd International Workshop on Haptic and Audio Interaction Design will be held in Seoul, Korea.

Program Co-Chairs
Ian Oakley (ETRI, Korea)
Stephen Brewster (University of Glasgow, UK)

Ian Oakley and Stephen Brewster
Haptic and Audio Interaction Design'07 Program Co-Chairs
email: [log in to unmask]
