MM-INTEREST Archives

ACM SIGMM Interest List

MM-INTEREST@LISTSERV.ACM.ORG

Sender: ACM SIGMM Interest List <[log in to unmask]>
From: Guillaume Chanel <[log in to unmask]>
Date: Wed, 27 Oct 2021 16:13:30 +0200

Call for contributions

*****IUI Dyadic IMPRESSION Recognition Challenge (virtual event)*****

https://simsimpression.unige.ch/


IMPORTANT DATES

  * 26/10/2021: Start of the challenge, release of training data
  * 07/01/2022: Abstract submission (validation results) and release of
    the test data
  * 24/01/2022: Final paper submission - end of the competition (test
    results)
  * 09/02/2022: Notification of paper acceptance
  * 22/03/2022: Workshop held (online)


MOTIVATION

The Dyadic IMPRESSION Recognition Challenge, to be held in March 2022 in
conjunction with IUI 2022 in Helsinki, Finland, will be devoted to all
aspects of artificial intelligence and behavioral science for the
analysis of human-human interaction from multimodal data.

To advance and motivate research on human bodily responses in dyadic
interactions, we organize this challenge around the open and accessible
multimodal IMPRESSION dataset. The challenge addresses multimodal
recognition as well as dynamic multi-user recognition, in which
information from both interlocutors can be exploited.


THE CHALLENGE

The challenge targets automatic impression recognition during dyadic
interactions: predicting the impression that one individual (the
Receiver) forms of the other (the Emitter). Self-reported impressions
along the warmth and competence dimensions are provided, together with
synchronized face videos, eye movements and physiological signals
(including ECG, BVP and GSR) of both Emitters and Receivers.

The challenge is composed of two phases:

  * Development phase: public training data will be released;
    participants develop their approaches and validate their
    predictions using a validation set.
  * Test (final) phase: participants submit their predicted targets
    for the test data, which will be released just a few days before
    the end of the challenge. We will then rank submissions by
    performance and communicate the results during the workshop.


Submissions are evaluated with the concordance correlation coefficient
(CCC), computed between the predicted continuous values and the
continuous ground-truth values for warmth and competence and averaged
over the tested participants.
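
For reference, Lin's CCC between a ground-truth series y and a
prediction p is CCC = 2*cov(y, p) / (var(y) + var(p) + (mean(y) -
mean(p))^2), using population moments. A minimal Python sketch of the
metric follows; the official evaluation script and data layout are
defined by the organizers, so the function and key names here are
illustrative assumptions only:

    import numpy as np

    def ccc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
        """Lin's concordance correlation coefficient (population moments)."""
        mu_t, mu_p = y_true.mean(), y_pred.mean()
        var_t, var_p = y_true.var(), y_pred.var()
        cov = np.mean((y_true - mu_t) * (y_pred - mu_p))
        return 2.0 * cov / (var_t + var_p + (mu_t - mu_p) ** 2)

    def challenge_score(truth: dict, preds: dict) -> float:
        """Average CCC over participants and both dimensions.
        truth/preds map participant -> {"warmth": array, "competence": array}
        (an assumed layout, not the official submission format)."""
        scores = [ccc(truth[p][d], preds[p][d])
                  for p in truth for d in ("warmth", "competence")]
        return float(np.mean(scores))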


All participants are invited to submit papers describing their
solutions to the challenge, following the workshop submission
guidelines of the IUI conference.


Examples of potential submissions include (but are not limited to):

  * the combination of multimodal measures from either the Receiver or
    the Emitter;
  * the computation of synchrony features between the Receiver and the
    Emitter (see the sketch after this list);
  * deep learning architectures which combine features from the
    Receiver and the Emitter;
  * transfer learning approaches to extract features;
  * comparative studies of several approaches.
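
As an illustration of the synchrony-feature direction above, the
following hypothetical Python sketch computes a windowed Pearson
correlation between one Receiver signal and one Emitter signal. The
choice of signal and the window and hop sizes are assumptions for
illustration, not challenge specifications:

    import numpy as np

    def windowed_synchrony(receiver: np.ndarray, emitter: np.ndarray,
                           win: int = 256, hop: int = 128) -> np.ndarray:
        """Pearson correlation between two synchronized 1-D signals
        (e.g., Receiver and Emitter GSR) over sliding windows."""
        feats = []
        for start in range(0, len(receiver) - win + 1, hop):
            r = receiver[start:start + win]
            e = emitter[start:start + win]
            feats.append(np.corrcoef(r, e)[0, 1])
        return np.asarray(feats)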



THE DATASET

The IMPRESSION dataset was collected to support the development of
automatic approaches to study and understand the mechanisms of
perception of, and adaptation to, verbal and nonverbal social signals
in dyadic interactions, taking individual and dyad characteristics
into account. To the best of our knowledge, there is no comparable
publicly available, non-acted, face-to-face dyadic dataset in the
field in terms of number of participants, recorded sessions, and
continuous impression labels in warmth and competence. Detailed
information about the IMPRESSION dataset is provided on the challenge
website and in the following paper:

https://archive-ouverte.unige.ch/unige:155675


Challenge Contact email:

[log in to unmask]


Organizers:

  * Chen Wang, University of Geneva, Switzerland
  * Guillaume Chanel, University of Geneva, Switzerland
  * Beatrice Biancardi, LTCI, Télécom Paris, France
  * Chloé Clavel, LTCI, Télécom Paris, France





