ACM SIGCHI General Interest Announcements (Mailing List)


margit pohl <[log in to unmask]>
Mon, 23 Nov 2015 22:40:44 +0100
Call for Papers: ACM IUI 2016 Workshop on Emotion and Visualization

Half day workshop at ACM IUI 2016 in Sonoma, CA, USA
March 10, 2016

  *** Paper submission deadline: December 18, 2015 ***

Interaction plays a fundamental role in visualization and visual analytics. However, the emotional states of users are rarely taken into account when developing interaction techniques or the visual representations themselves. In many application scenarios, it would be beneficial to visualize emotional measurements in order to raise users' awareness of their own and others' emotional states, as well as of how these emotions influence their reasoning and decision making. Additionally, with the help of emotional data, visualization systems could adapt their interaction and visual representations to users' mental models, cognitive processes, and emotional states in order to support the analysis and exploration process.

Our goal is to foster this novel area of visualization that takes user affective states into consideration. The EmoVis 2016 workshop is co-located with the ACM International Conference on Intelligent User Interfaces (ACM IUI 2016) and welcomes researchers, practitioners and experts from a variety of scientific domains, including visualization, human-computer interaction, artificial intelligence, cognitive psychology, and multimedia. This workshop will act as a forum where people with diverse backgrounds can present design principles and introduce novel techniques for affect measurement and visualization.

We invite submissions that explore user emotional states in context of interactive visualization. Topics for submissions include but are not limited to the following areas:

* emotion measurement devices and technologies, such as BCIs or smart wristbands
* emotion detection algorithms, e.g., based on user physiology, user language/intonation or written text like Twitter posts
* emotion visualization for individuals, groups or crowds
* emotion visualization for supporting social interaction
* capturing or generating engagement at large-scale events
* awareness of emotions in collaboration or analysis tasks
* emotion-adaptive visualizations capable of determining and reacting to user emotional states 
* user studies and evaluation techniques for emotion visualization and emotion-adaptive visualizations

Workshop papers accepted for EmoVis 2016 will be published in joint CEUR proceedings. Submissions may be full research papers (maximum 8 pages) or short papers (maximum 4 pages) describing position statements, novel ideas, work in progress, or recent results related to emotion and visualization. Authors should clearly indicate the submission type at the end of the abstract to help reviewers better understand their contributions. Papers will be selected based on novelty and quality as well as on their potential to contribute to the workshop discussions. More information on how to submit a paper can be found at


EmoVis 2016 Co-Chairs

* Andreas Kerren (Linnaeus University, Sweden)
* Daniel Cernea (AGT International, Germany)
* Margit Pohl (Technical University of Vienna, Austria) 

Important Dates

* Paper submission deadline: December 18, 2015
* Author notification: January 22, 2016
* Camera-ready paper deadline: February 5, 2016 

If you have any questions, please contact the workshop organizers: [log in to unmask]
