ACM SIGCHI General Interest Announcements (Mailing List)


From: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Date: Wed, 1 Sep 2021 17:14:32 +0000
1st DEI4EAI Workshop: Embodied AI and Gender

We are delighted to inform you that the project DEI4EmbodiedAI is officially starting!

It’s a Dutch collaboration (4TU, NIRICT) that aims to critique existing problematic norms and provide tangible resources for practicing DEI (Diversity, Equity and Inclusion) when developing embodied AI, by nurturing a DEI network among academics and societal partners. We will organize a series of online or hybrid workshops (depending on Covid-19 measures) to raise awareness, assess current practices, and shape the future of our ways of working.

Here’s the call for participation in our first workshop. Participation is free.

Important Dates

Registration deadline: 14th September 2021, anywhere on Earth (AoE)

Workshop date (half a day): 15th September 2021 14:00 CET to 18:30 CET

Register here:


The workshop is part of a series of four workshops organized by the Diversity, Equity and Inclusion for Embodied AI (DEI4EmbodiedAI) project.

State of the world. Today, embodied AI (e.g., smart objects, robots, smart personal assistants) is an expression of power: it can be used to support human flourishing through human-agent relationships, but it can also support surveillance, perpetuate bias, and amplify injustice. Recent years have seen a growing number of calls to consider gender during the design or evaluation of software, websites, and other digital technology, and research has outlined how gender plays a role in the design and use of such technology. Bias, stereotypes, and gender norms are often embedded in technology, implicitly and explicitly, with massive societal impact. Designers, researchers, and societal stakeholders all have the responsibility to reflect on the values, perspectives, biases, and stereotypes they embed in embodied AI technology. Comparatively less work has been done in the fields of human-robot interaction and human-agent interaction. We want to take action.

Integrating DEI in the way we develop embodied AI. For example, we’ve learnt that voice assistants may fail to recognize certain accents, that image recognition algorithms embedded in IoT devices may mislabel people based on assumed gender, and that embodied AI, like robots, can be non-inclusive in design, e.g., robots with "female" voices and white bodies. These issues raise several questions about gender in the design of embodied AI: How do we make sure that we design and develop while being mindful of the biases, stereotypes, and values we hold about gender? How do we integrate these reflections into our processes rather than confining them to an afterthought? What practical actions can we take in our daily practices to integrate diversity, equity, and inclusion into the design process itself?

What is the workshop about? In this half-day workshop, we are going to learn more about gender in embodied AI, together with experts, artists, colleagues, and societal stakeholders. We will use methods from critical design to 1) create a hands-on understanding of our current practices and narratives, and 2) compile a concrete, desirable future scenario, providing practical pointers for implementing design processes with diversity, equity, and inclusion in mind.

Outcomes of the workshop. We will co-shape future scenarios of DEI practices that are tangible and will help us in our everyday work. Outcomes might be compiled into an academic publication and a zine.

Our position. We want to take action as academics, aware of the privileged and powerful role that we have. We come from different positions of privilege and marginalization, and we have had a range of experiences navigating issues related to diversity, equity, and inclusion. Our experiences and outlooks cannot and do not represent everyone who shares a particular identity. We hope to engage in a meaningful conversation with the embodied AI community at large: listening and co-creating in a spirit of reflexivity.

Target Audience:

We welcome a broad audience from various disciplines and practices, including but not limited to roboticists, human-robot interaction researchers, human-computer interaction researchers, philosophers, engineers, computer scientists, sociologists, and psychologists. We also welcome practitioners working in industry and non-profits, and we particularly encourage citizens and societal associations interested in the topic of gender and AI to participate. We will make the workshop inclusive, making space for various voices: every kind of expertise is valuable.

Public Program (please register; we will send the link):
14:00 Welcome
14:30 LGBTQAI+ and AI
15:00 Keynote and Q&A: Catherine D’Ignazio (MIT)
15:45 Break
16:00 Keynote and Q&A: Lisa Mandemaker (Designer and Artist)

Workshop for registered participants
16:30 Introduction of activities and participants; division into groups
16:35 Our ways of working in Embodied AI: how we treat gender now
17:00 Break
17:15 Our ways of working in Embodied AI: desirable future for gender in  embodied AI
17:45 Plenary Discussion
18:15 Outro and invitation to next workshop

Dr. ir. Cristina Zaga, assistant professor at the University of Twente (EEMCS/ET Faculty)
Dr. Nazli Cila, assistant professor at the Delft University of Technology (IDE Faculty)
Dr. Maria Luce Lupetti, post-doc at the Delft University of Technology (IDE Faculty)
Dr. Minha Lee, assistant professor at the Eindhoven University of Technology (Future Everyday Group)
Dr. Gijs Huisman, assistant professor at the Delft University of Technology (IDE Faculty)
Dr. Eduard Fosch Villaronga, assistant professor at Leiden University (School of Law)
