CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From: Göbel Fabian <[log in to unmask]>
Reply To: Göbel Fabian <[log in to unmask]>
Date: Thu, 29 Nov 2018 10:53:12 +0000
Content-Type: text/plain
Parts/Attachments: text/plain (132 lines)
--- APOLOGIES FOR CROSS-POSTING ---

Second Call for Papers


ET4S 2019 - Eye Tracking for Spatial Research 2019

as a conference track at ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications

June 25-28, 2019 in Denver, Colorado, USA


http://www.spatialeyetracking.org/

[log in to unmask]


Eye tracking has become a popular method for investigating research questions related to geographic space. This includes studies of how people interact with geographic information systems, studies of how space is perceived in decision situations, and work that uses gaze as an input modality for spatial human-computer interaction. The ETRA track on Eye Tracking for Spatial Research (ET4S) aims to bring together researchers from different areas who have a common interest in using eye tracking for research questions related to spatial information and spatial decision making.

After three successful ET4S workshops in 2013, 2014 and 2018, the 4th edition of ET4S will be organized as a conference track at ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications (http://etra.acm.org/2019/).

Topics of interest include, but are not limited to:

   - Gaze-Based Interaction with Maps and other Spatial Visualizations
   - Evaluation of Cartographic and other Spatial Visualizations with Eye Tracking
   - Gaze-Aware Mobile Assistance and Location-Based Services
   - Navigation Studies, Wayfinding, and Eye Tracking
   - Visual Perception and Exploration of (Indoor and Outdoor) Space
   - Landscape Perception
   - Gaze During Spatial and Spatio-Temporal Decision Making
   - Eye Tracking as a Tool for Spatial Cognition Research
   - Spatio-Temporal Analysis and Visualization of Eye Tracking Data
   - Eye Tracking in Traffic Research, Car Navigation, Public Transport, and Aviation



Changes compared to previous ET4S events

Unlike in previous years, ET4S is NOT organized as a workshop, but as a track at the ACM Symposium on Eye Tracking Research & Applications. This implies the following changes:

   - We call for both long and short papers.
   - The contribution must be equivalent to that of the corresponding submission category at ETRA.
   - Accepted papers are considered regular ETRA publications and will be part of the ETRA proceedings (ACM digital library). A footnote on the first page will indicate that your paper was part of ET4S.
   - The submission process follows that of ETRA, but with a separate program committee.
   - At least one author of each accepted ET4S paper must register for the ETRA conference. Participants will be free to move between ET4S and other ETRA tracks.


Submission guidelines

Please follow the ETRA submission guidelines for the long paper or short paper category:

     http://etra.acm.org/2019/authors.html

During the submission process through EasyChair, please indicate that your paper is intended for the ET4S track.


Important dates (same as for ETRA)

December 14, 2018     Paper abstracts due
December 19, 2018     Long & short papers due
January 23, 2019      Feedback: reviews to authors
January 28, 2019      Rebuttals and revised papers due
February 18, 2019     Final notifications to authors
March 22, 2019        Camera-ready papers due
June 25-28, 2019      ETRA conference with ET4S track; scheduling of the track will depend on the number of accepted ET4S papers


Program committee

Gennady Andrienko, Fraunhofer IAIS/City University London, Germany/UK
Christina Bauer, University of Regensburg, Germany
Roman Bednarik, University of Eastern Finland, Finland
Kenan Bektas, Zurich University of Applied Sciences, Switzerland
Tanja Blascheck, University of Paris-Saclay, France
Michael Burch, Eindhoven University of Technology, Netherlands
Arzu Cöltekin, University of Zurich, Switzerland
Florian Daiber, German Research Center for Artificial Intelligence (DFKI), Germany
Beatrix Emo, ETH Zurich, Switzerland
Sara Fabrikant, University of Zurich, Switzerland
Hans Gellersen, Lancaster University, UK
Amy Griffin, Royal Melbourne Institute of Technology, Australia
Wilko Heuten, University of Oldenburg, Germany
Christophe Hurter, Ecole Nationale de l'Aviation Civile, France
Francis Jambon, University of Grenoble Alpes, France
Christian Lander, German Research Center for Artificial Intelligence (DFKI), Germany
Andrew Kun, University of New Hampshire, USA
Thomas Kübler, University of Tübingen, Germany
Bernd Ludwig, University of Regensburg, Germany
Vsevolod Peysakhovich, Federal University of Toulouse Midi-Pyrénées, France
Ken Pfeuffer, Bundeswehr University Munich, Germany
Victor Schinazi, ETH Zurich, Switzerland
Artemis Skarlatidou, University College London, UK
Sophie Stellmach, Microsoft, USA
Rul von Stülpnagel, University of Freiburg, Germany
Yanxia Zhang, FX Palo Alto Laboratory, USA



ET4S organizers

Peter Kiefer, ETH Zurich, Switzerland
Fabian Göbel, ETH Zurich, Switzerland
David Rudi, ETH Zurich, Switzerland
Ioannis Giannopoulos, TU Vienna, Austria
Andrew T. Duchowski, Clemson University, SC, USA
Martin Raubal, ETH Zurich, Switzerland



Contact

[log in to unmask]




    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
