From:    Myounghoon Jeon <[log in to unmask]>
Date:    Fri, 9 Nov 2018 14:55:03 -0500
Sender:  "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>

Dear CHI Community,

(Apologies for cross-postings)
Please consider submitting your work:

JOURNAL ON MULTIMODAL USER INTERFACES
Springer Journal


*Auditory Displays and Auditory User Interfaces*

*Art-Design-Science-Research*



*Guest Editors:*

Myounghoon Jeon, Virginia Tech, USA ([log in to unmask])

Areti Andreopoulou, University of Athens, Greece ([log in to unmask])

Brian FG Katz, Sorbonne University, France ([log in to unmask])



Deadline for paper submission: May 1, 2019



This special issue on Auditory Displays and Auditory User Interfaces:
Art-Design-Science-Research (ADSR) is motivated by the theme of the 2018
conference of ICAD (the International Community for Auditory Display), a
wordplay on the term “ADSR” (Attack-Decay-Sustain-Release), commonly used
in sound-related domains. This is an open call for contributions under
this broad theme. We welcome technical, theoretical, and empirical papers
that contribute to any aspect (art, design, science, and research) of
auditory displays and auditory user interfaces.



Designers and researchers have tried to make auditory displays and auditory
user interfaces more useful in numerous areas, extending typical
visuo-centric interactions to multimodal and multisensorial systems.
Application areas include education, assistive technologies, auditory
wayfinding, auditory graphs, speech interfaces, virtual and augmented
reality environments, and associated perceptual, cognitive, technical, and
technological research and development. *Research through design* or *embedded
design research* has recently become more pervasive for auditory display
designers. Hence, we welcome all types of “design” activities as a
necessary process in auditory display. In addition, methodical evaluation
and analysis have become more prominent, leading to more robust science. In
this iterative process, auditory displays can achieve improved reliability
through robust and repeatable research. In some areas, we have already
arrived at the *science* stage, while in other areas we are still exploring
the possibilities.



Pursuing novelty encourages artists to integrate different genres and to
transform across modalities. By definition, auditory displays and
sonification *transform* data into sound. Thanks to the characteristics of
this transformation, there have been active interactions between auditory
displays and various forms of art. Hence, we would also like to invite
contributions addressing artistic approaches to auditory displays and
auditory user interfaces.
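
As a purely illustrative aside (not part of the call itself), the minimal
parameter-mapping sonification sketch below shows the kind of data-to-sound
transformation meant here: each value of a small data series is mapped onto
a pitch between two assumed frequency bounds and rendered as a short tone in
a WAV file. The function name, frequency range, and output path are
hypothetical choices made for this example only.

# Illustrative parameter-mapping sonification sketch: each data value is
# mapped linearly onto a pitch and rendered as a short sine tone. All names
# and parameter values here are assumptions for the example.
import math
import struct
import wave

SAMPLE_RATE = 44100      # samples per second
TONE_SECONDS = 0.25      # duration of the tone for each data point

def sonify(data, low_hz=220.0, high_hz=880.0, path="sonification.wav"):
    """Map each value in `data` onto [low_hz, high_hz] and write the
    resulting tone sequence to a mono 16-bit WAV file."""
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0          # avoid division by zero for constant data
    frames = bytearray()
    for value in data:
        freq = low_hz + (value - lo) / span * (high_hz - low_hz)
        for n in range(int(SAMPLE_RATE * TONE_SECONDS)):
            sample = math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767 * 0.8))
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)          # 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

# Rising data produces a rising pitch contour.
sonify([1, 2, 3, 5, 8, 13])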

Rather than insisting on a specific approach, we encourage a broad spectrum
of diverse strategies. In other words, all these approaches – art, design,
science, and research – should be balanced and utilized more flexibly
depending on the circumstances.


*Topics of interest* for this special issue include, but are not limited
to, the following:

   - Multimodal user interfaces (auditory +
   visual/haptic/tactile/olfactory/gustatory, etc.)
   - Auditory displays inspired by music or other forms of art
   - Culture-specific auditory displays
   - Speculative, aspirational prototype designs, case studies, and
   real-world applications
   - Aesthetics of auditory displays and auditory user interfaces
   - Auditory display design paradigm, theory, and taxonomy
   - Design methods, processes, tools, and techniques
   - Users, experiences, and contexts of auditory displays and auditory
   user interfaces
   - Development of new sensors, devices, or platforms for auditory
   displays and auditory user interfaces
   - Accessibility, inclusive design, and assistive technologies
   - Computational/algorithmic approaches
   - Human factors, ergonomics, and usability
   - Auditory displays with a focus on spatial/3D sound
   - Sonification in Health and Environmental Data (soniHED)
   - Sonification in the Internet of Things, Big Data, or Cybersecurity
   - Sonification in vehicles



*Schedule:*

Submission deadline:   May 1, 2019

1st round review:      August 1, 2019

2nd round review:      October 1, 2019

Publication:           Spring 2020



*Author Instructions:*

Submissions should be 8-12 pages long and present original, unpublished
work. Previously presented conference and workshop papers should include a
minimum of 30% new content. Authors are required to follow the Author’s
Guide for manuscript submission to the Journal on Multimodal User
Interfaces, published by Springer.

(http://www.springer.com/computer/hci/journal/12193)


During the submission process, please select “S.I.: Auditory Display 2018”
as the article type.


************************************************
Myounghoon "Philart" Jeon, Ph.D.
Associate Professor
*Mind Music Machine Lab <http://ise.vt.edu/philart>*
Grado Department of Industrial and Systems Engineering
Virginia Tech
519D Whittemore Hall
1185 Perry St. Blacksburg, VA 24061
[log in to unmask]

