CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Sender: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
From: Toine Bogers <[log in to unmask]>
Reply-To: Toine Bogers <[log in to unmask]>
Date: Thu, 6 Jan 2022 13:53:33 +0000
*************************************************************************
Workshop on Information Quality in Information Interaction and Retrieval (IQIIR 2022)

March 14, 2022, 9:00-17:00

This workshop is part of the ACM Conference on Human Information Interaction and Retrieval (CHIIR) 2022, taking place 14-18 March 2022 online and in Regensburg, Germany.

CHIIR 2022 website: https://ai.ur.de/chiir2022/home
*************************************************************************

************************** SCOPE **************************
The aim of the IQIIR workshop is to provide a forum where researchers from the full spectrum of CHIIR interests can come together to examine and discuss the role and impact of information quality on human-information interaction and retrieval. In light of emergent research themes at CHIIR, e.g., work on conversational search, search as learning, cognitive biases, misinformation and fake news, or fairness and transparency of retrieval results, we aim to explore the role of information quality, the methods used to assess it, and the influence of quality signals on end users.

Any recommendation or retrieval system aims to rank ‘the best’ items first. The notion of item quality can be handled quite differently by different systems, ranging from the objectified relevance assessments of the Cranfield paradigm and the subjective user feedback of social media to algorithmic approaches like PageRank. While each of these methods adopts and optimizes for a different interpretation and aspect of ‘quality’, determining what counts as ‘the best’ nonetheless remains an intractable problem. Recent conceptualizations instead propose a conversationalist approach to quality, arguing that information quality can only be determined publicly and socially in shared forums. This opens up the research question of how users can be more involved in the assessment of quality and how different quality signals can inform and influence end users.

The IQIIR 2022 workshop will focus on taking stock of different perspectives on information quality along with methods for its evaluation and optimization. Towards this end, the workshop has two main goals:

  * Collect and compare methods for the assessment of information quality
We welcome both empirical and position papers on users discussing, evaluating, tagging, or otherwise being involved in the assessment of quality and qualities. This includes investigations of and reflections on review sites, collaborative writing systems, question-answer sites, discussion fora, and the like. We intend to further discuss and explore the value of such discursive, collective elements for transparent rankings.

  * Evaluate and frame the influence of (explicit) quality signals on search
Understanding how searchers use available information in assessing resource quality is an open research question. We therefore welcome both empirical and position papers on the role of quality and explicit quality signals (i.e., knowledge-context) in searching and in users’ comprehension (i.e., search-as-learning). We intend to further discuss and explore a theoretical and empirical framework for addressing these questions in the context of human-information interaction and retrieval.

We take a particular interest in perspectives that involve users and in evaluations that include end users. All perspectives that fit the specifics and breadth of the CHIIR community are welcome. Relevant topics include (but are not limited to):

  * Quality assessment methods and procedures
  * Collective knowledge construction on wikis
  * Reviews and user-generated content as ranking signals
  * Online conversations about information quality, e.g., on fora and in reviews
  * Challenges with collaborative and social quality assessments
  * Knowledge context and quality considerations on a SERP
  * Users’ feelings of knowing and epistemic beliefs
  * Credibility evaluations and users' source evaluations
  * Validation of quality assessments
  * Algorithmic assessments of quality
  * Democratic indexing and quality tags
  * Detection and analysis of disinformation, hoaxes, and fake news
  * Empirical characterization of false information
  * Reducing misinformation effects (e.g., echo chambers, filter bubbles)
  * User/content trustworthiness
  * Relation between relevance (methods) and quality (aspects)


************************** SUBMISSIONS **************************
We invite two types of original contributions: empirical papers and position papers. In empirical papers, authors are invited to share novel findings, preliminary results, and post-hoc analyses; detailed analyses and case-based studies are welcome as well. In position papers, authors offer perspectives and/or argue for challenges, benefits, best practices, and strategies for the study of quality assessments and signals; we also invite theoretical comments, feedback, and ideas. The main goal of both types of papers is to offer arguments and cases that allow presenters to probe, in discussion, the concepts and interplay of the presented work and positions.

All submissions should be in English and should not have been published or submitted for publication elsewhere. Empirical papers may, however, build on previously published findings and expand upon them through further and new analyses. Papers should be between 4 and 9 pages, formatted in the ACM Proceedings Style, and submitted via EasyChair (https://easychair.org/conferences/?conf=iqiir2022). All submissions will be peer-reviewed by the Program Committee, and accepted submissions will be published in the workshop proceedings.


********************************* FORMAT *********************************
Accepted papers will be presented in themed sessions in the morning. In the afternoon, break-out groups will discuss the main ideas and challenges raised in the morning sessions. The workshop will conclude with a collaborative writing session aimed at developing a short report that inventories the different approaches to information quality, drawing on the expertise of the attendees, and that identifies research challenges and opportunities.


********************************* IMPORTANT DATES *****************************
Submission deadline: January 14, 2022 (AoE)
Notification of acceptance: January 27, 2022
Camera-ready deadline: February 11, 2022
Workshop: March 14, 2022


********************************* ORGANIZERS *****************************
Frans van der Sluis (University of Copenhagen, Denmark)
Toine Bogers (Aalborg University Copenhagen, Denmark)
Florian Meier (Aalborg University Copenhagen, Denmark)
Catherine Smith (Kent State University, USA)


********************************* VENUE *********************************
The IQIIR 2022 workshop will be held on March 14, 2022, co-located with the CHIIR 2022 conference in Regensburg. Like the main conference, the IQIIR workshop will take place fully online with virtual participation only.


********************************* RELEVANT LINKS *****************************
IQIIR 2022 Workshop homepage: http://iqiir2022.aau.dk

Registration for the IQIIR 2022 workshop is handled via CHIIR 2022: https://ai.ur.de/chiir2022/register

Contact the organizers: [log in to unmask]

    ----------------------------------------------------------------------------------------
    To unsubscribe from CHI-ANNOUNCEMENTS send an email to:
     mailto:[log in to unmask]

    To manage your SIGCHI Mailing lists or read our polices see:
     https://sigchi.org/operations/listserv/
    ----------------------------------------------------------------------------------------
