ACM SIGCHI General Interest Announcements (Mailing List)


"ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Wed, 8 Nov 2017 15:59:08 +1100
text/plain; charset="UTF-8"
text/plain (173 lines)
[apologies for cross-postings]


The Journal of the Association for Information Science and Technology
seeks original manuscripts for a “Special Issue on Conversational
Approaches to Information Retrieval”
(deadline: February 23, 2018).

*Summary and Scope*

Conversational search interfaces are increasingly common and include
intelligent mobile assistants such as Cortana, Google Now, and Siri;
intelligent home assistants such as Amazon Alexa and Google Home; and a
myriad of software agents (or chatbots) that users can interact
with inside messaging platforms such as Slack, Yammer, and Facebook
Workplace. Conversational search systems differ from traditional
search systems in several ways. First, at their core, conversational search
systems aim to support multi-turn, user-machine dialogues for information
access and retrieval.  Second, some systems aim to engage users in more
naturalistic interactions, for example, by supporting spoken, natural
language information requests. Finally, some systems aim to support
multi-modal interaction, for example by accepting either textual or spoken
input and by balancing output between the screen and speech.

Prior and current research in the fields of information retrieval,
information science, and human-computer interaction is certainly relevant
to the design, development, and evaluation of conversational search
systems. From the system side, for example, prior research has focused on
improving voice query recognition and on automatically reducing verbose
queries in order to improve retrieval performance.  From the human side,
prior research has focused on understanding voice query reformulations in
response to a system error, understanding why and how users switch
modalities (e.g., textual versus spoken input), and developing methods for
intelligent assistant evaluation.

While different aspects of conversational search systems have been
investigated in prior work, many open questions remain.  How can systems
use dialogue to support information access and retrieval? How can existing
technologies such as query suggestion, results clustering, and relevant
facet prediction be used in conversational approaches to IR? What do users
want from a conversational search interface? How can a system infer user
satisfaction from conversational interactions?

In this Special Issue, we invite submissions on all aspects of
conversational approaches to information access and retrieval.  We invite
submissions addressing all modalities of conversation, including
speech-based, text-based, and multimodal interaction. We also welcome
studies of human-human interaction (e.g., collaborative search) that can
inform the design of conversational search applications.  Finally, we
welcome research on methods for evaluation of conversational IR systems.

*Topics of Interest*

Query understanding and search process management

●        Processing verbose natural language queries

●        Processing noisy ASR queries

●        Query intent disambiguation, clarification, confirmation

●        Query suggestion

●        Relevance feedback in conversational search

●        Voice-based search engine operations

●        Dialogue schema for conversational search

Search result description (presentation)

●        Audio-based search result presentation and summarization

●        Conversational navigation of search results

●        Knowledge graph presentation in conversational search

●        Advertisements in audio-based search result presentation

Ranking algorithms

●        Ad-hoc spoken search

●      Spoken search in session

●        Search result diversification

Evaluation

●        Building test collections for conversational search

●        Development of new metrics to measure the effectiveness, engagement,
and satisfaction of conversational search
Applications

●        Intelligent personal assistants

●        Intelligent home assistants using voice/speech-oriented devices

●        Proactive search/recommendation

●        Collaborative search

●        Hands-free search (e.g., in the car or kitchen)

●        Search for visually impaired users

●        Search for low-literacy users

●        Integration with existing technologies

*Submission Guidelines*

Before submitting your manuscript, please ensure you have carefully read
JASIST Submission Guidelines. The complete manuscript should be submitted
through JASIST’s Submission System. To ensure that you submit to the
correct special issue, please select “Special Issue on Conversational
Approaches to Information Retrieval” as your manuscript type.

*Submission Deadlines*

Paper submission due: February 23, 2018

First round review notification: April 20, 2018

First revision due: June 15, 2018

Second round review notification: August 10, 2018

Second revision due: September 7, 2018

Final notification: October 5, 2018

*Guest Editors*

Jaime Arguello (University of North Carolina at Chapel Hill)

Maarten de Rijke (University of Amsterdam)

Hideo Joho (University of Tsukuba)

Damiano Spina (RMIT University)
