Date:     Mon, 23 Aug 2021 13:42:53 -0500
From:     Matt Lease <[log in to unmask]>
Reply-To: Matt Lease <[log in to unmask]>
Sender:   "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>

*CSCW 2021 Workshop - Investigating and Mitigating Biases in Crowdsourced Data*
<https://sites.google.com/view/biases-in-crowdsourced-data>

The workshop will be held at ACM CSCW 2021, virtually on the *23rd of
October 2021* from 3 PM to 8 PM EDT.

The workshop will explore how specific crowdsourcing workflows, worker
attributes, and work practices contribute to biases in data. We also plan
to discuss research directions to mitigate labelling biases, particularly
in a crowdsourced context, and the implications of such methods for the
workers.

We invite participants to take part in the workshop challenge and/or submit
a position paper.

   - Submit a short position paper by *10 September 2021*
   - Register for the Crowd Bias Challenge by *23 September 2021*

*Workshop Themes*


   - Understanding how annotator attributes contribute to biases

Research on crowd work has often focused on task accuracy, whereas other
factors such as biases in the data have received limited attention. We are
interested in reviewing existing approaches and discussing ongoing work
that helps us better understand which annotator attributes contribute to
biases.


   - Quantifying bias in annotated data

An important step towards bias mitigation is detecting such biases and
measuring their extent in the data. We seek to discuss different methods,
metrics, and challenges in quantifying biases, particularly in crowdsourced
data. Further, we are interested in ways of comparing biases across
different samples and in investigating whether specific biases are
task-specific or task-independent (a small illustrative sketch follows the
themes list below).


   - Novel approaches to mitigate crowd bias

We plan to explore novel methods that aim to reduce biases in crowd
annotation in particular. Current approaches include worker pre-selection,
improved task presentation, and dynamic task assignment. We seek to discuss
the shortcomings and limitations of existing and ongoing approaches and to
ideate future directions.


   - Impact on crowd workers

We want to explore how bias identification and mitigation strategies can
impact the workers themselves, positively or negatively. For example,
workers in certain groups may face increased competition and reduced task
availability, and collecting worker attributes for profiling could raise
ethical concerns.
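
As a concrete illustration of the quantification theme above, the short
sketch below shows one simple way bias in crowdsourced labels might be
measured: compare the positive-label rate across annotator subgroups and
report the gap between groups. This is a minimal sketch in Python, not
material from the workshop; the data, group names, and the rate-gap metric
are hypothetical placeholders, assuming binary labels and known worker
group membership.

from collections import defaultdict

# Toy annotation records: (worker_id, worker_group, item_id, label).
# Workers, groups, and labels here are hypothetical placeholders.
annotations = [
    ("w1", "group_a", "item1", 1),
    ("w2", "group_a", "item2", 1),
    ("w3", "group_a", "item3", 1),
    ("w4", "group_b", "item1", 0),
    ("w5", "group_b", "item2", 1),
    ("w6", "group_b", "item3", 0),
]

def positive_rate_by_group(records):
    """Fraction of annotations labelled 1, per annotator group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for _worker, group, _item, label in records:
        counts[group][0] += label
        counts[group][1] += 1
    return {group: pos / total for group, (pos, total) in counts.items()}

rates = positive_rate_by_group(annotations)
gap = max(rates.values()) - min(rates.values())
print(rates)                                  # per-group positive-label rates
print(f"rate gap between groups: {gap:.2f}")  # one crude group-level bias measure

The same comparison could be repeated per task type to probe whether such
gaps are task-specific or task-independent, as raised in the quantification
theme.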



More details at https://sites.google.com/view/biases-in-crowdsourced-data

-- 
Matt Lease
Associate Professor
School of Information
University of Texas at Austin
Voice: (512) 471-9350 · Fax: (512) 471-3971 · Office: UTA 5.536
http://www.ischool.utexas.edu/~ml

    ----------------------------------------------------------------------------------------
    To unsubscribe from CHI-ANNOUNCEMENTS send an email to:
     mailto:[log in to unmask]

    To manage your SIGCHI Mailing lists or read our policies see:
     https://sigchi.org/operations/listserv/
    ----------------------------------------------------------------------------------------
