CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From:
zakia hammal <[log in to unmask]>
Reply To:
zakia hammal <[log in to unmask]>
Date:
Tue, 2 Jun 2015 16:30:15 +0100
************************************************
CFP - Apologies for multiple copies
************************************************

The First International Workshop on Modeling INTERPERsonal SynchrONy - INTERPERSONAL@ICMI2015
(http://interpersonalicmi2015.isir.upmc.fr)

at the 17th International Conference on Multimodal Interaction (ICMI 2015)
(http://icmi.acm.org/2015/)


_______
SCOPE
_______

Understanding human behavior through computer vision and signal
processing has become a major interest with the emergence of social
signal processing and affective computing and their applications to
human-computer interaction. With few exceptions, research has focused on
the detection of individual persons and their nonverbal behavior in the
context of emotion and related psychosocial constructs. With advances in
methodology, there is increasing interest in moving beyond the
individual to the social interaction of multiple individuals. This level
of analysis brings to the fore the detection and understanding of
interpersonal influence and interpersonal synchrony in social
interaction.

Interpersonal synchrony in social interaction is the dynamic and
reciprocal adaptation of the verbal and nonverbal behaviors of
interactive partners. It affords both a novel domain for computer vision
and machine learning and a novel context in which to examine individual
variation in the cognitive, physiological, and neural processes of the
interacting members. Interdisciplinary approaches to interpersonal
synchrony are encouraged. Investigating these complex phenomena has both
theoretical and practical applications.

The proposed workshop will explore the challenges of modeling,
recognizing, and synthesizing influence and interpersonal synchrony. It
will address theory, computational models, and algorithms for the
automatic analysis and synthesis of influence and interpersonal
synchrony. We wish to explore both influence and interpersonal synchrony
in human-human and human-machine interaction, in dyadic and multi-person
scenarios. Expected topics include the definition of different
categories of interpersonal synchrony and influence, multimodal corpus
annotation of interpersonal influence, the dynamics of relevant
behavioral patterns, and the synthesis and recognition of verbal and
nonverbal patterns of interpersonal synchrony and influence. The
INTERPERSONAL workshop will afford an opportunity to discuss new
applications such as clinical assessment, consumer behavior analysis,
and the design of socially aware interfaces.

The INTERPERSONAL workshop will identify and promote research challenges
relevant to this exciting topic of synchrony.

______________
LIST OF TOPICS
______________
We encourage papers and demos addressing, but not limited to, the following research topics:

- Theoretical approaches to interpersonal synchrony in human/human and human/machine interaction
- Analysis and detection of non-verbal patterns of interpersonal synchrony/influence
- Models taking into account the relationship between influence and synchrony
- Analysis and detection of physiological signals
- Modeling interpersonal synchrony in dyadic and in multi-party social interaction
- Psychological correlates of interpersonal synchrony/influence
- Analysis and detection of functional roles, persuasion, trust, dominance, and related constructs
- Recording and annotation of corpora that vary in degree of experimental control
- Qualitative and quantitative evaluation
- Design of social agents and dialog systems.

_________________________
SUBMISSIONS AND REVISIONS
_________________________

Long paper: 8 pages maximum in the two-column ACM conference format.
Accepted long papers will be presented as either a long talk or a
poster.
Short paper: 4 pages maximum in the two-column ACM conference format.
Accepted short papers will be presented as either a short talk or a
poster.

Submissions should include: title, author(s), affiliation(s), e-mail
address(es), tel/fax number(s), and postal address(es).

Papers must be submitted at the following link:


https://easychair.org/conferences/?conf=interpersonalicmi201


All contributions will be peer-reviewed by at least three reviewers
from the Program Committee.

__________
DEADLINES
__________

July 20th, 2015: Submission deadline

August 4th, 2015: Notification of acceptance

August 17th, 2015: Camera-ready version due (in electronic form)

November 13th, 2015: INTERPERSONAL@ICMI2015 workshop


______________
ORGANIZATION
______________

Mohamed Chetouani,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(mohamed.chetouani at upmc.fr)

Giovanna Varni,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(varni at isir.upmc.fr)

Hanan Salam,
Institute for Intelligent Systems and Robotics,
University Pierre and Marie Curie, Paris, France
(salam at isir.upmc.fr)

Zakia Hammal
Robotics Institute
Carnegie Mellon University
(zakia_hammal at yahoo.fr)

Jeffrey F. Cohn
University of Pittsburgh
Robotics Institute
Carnegie Mellon University
(jeffcohn at cs.cmu.edu)

______________
SPONSORS
______________

This workshop is partially supported by the Laboratory of Excellence SMART (http://www.smart-labex.fr)



Zakia Hammal, PhD
The Robotics Institute, Carnegie Mellon University
http://www.ri.cmu.edu/

Human-Machine Interaction
Facial Expression Recognition
Visual Perception
http://www.pitt.edu/~emotion/ZakiaHammal.html 

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
