The Perceptual User Interfaces (PUI) research group at the Max Planck
Institute for Informatics, headed by Dr. Andreas Bulling, invites applications
for a fully funded PhD studentship on pervasive gaze estimation and
attention analysis.

Computer vision is a powerful sensing modality and a key component in
state-of-the-art eye tracking systems. At the same time, computer vision
still represents one of their major limitations, particularly in
unconstrained daily-life settings in which many of the assumptions on which
existing algorithms rely, such as stable lighting conditions, stable body
and head position, or special-purpose high-quality cameras, typically cannot
be satisfied.
Continuous gaze estimation throughout the day, so-called pervasive eye
tracking, promises a paradigm shift in our understanding of how the eyes can
be used as an input modality and source of information on user attention.

The proposed PhD project will develop and study image processing and
computer vision techniques for pervasive gaze estimation and attention
analysis. Potential applications are in visual behaviour monitoring, human-
computer interaction, context-aware computing and experimental psychology.
The research will be experimental, using stationary and portable cameras and
camera systems, and will involve user studies and data collection in daily
life settings. In addition to experimental skills, the work will require the
candidate to develop a thorough understanding of image processing, computer
vision and machine learning techniques suitable for pervasive gaze
estimation and attention analysis.

Research directions of particular interest include, but are not limited to:

    * Real-time image processing algorithms for pupil detection, tracking and
      gaze estimation in pervasive settings

    * Computer vision and machine learning for attention analysis on situated
      and hand-held portable devices

    * Image processing and computer vision techniques for multimodal
      human-computer interaction

The Max Planck Institute for Informatics offers a highly collegial and
stimulating environment for doctoral research training. The successful
candidate will join a young and ambitious research group that is at the
forefront of this emerging research area. The candidate will be expected to
contribute to the strong profile of the group by participating in the
preparation and publication of research results at leading international
conferences and in journals.

We invite applications from enthusiastic individuals who are able to work
independently, hold an excellent first degree in Computer Science or a
related field relevant to the proposed research, and have very good
knowledge of image processing, computer vision or machine learning for
computer vision.
The studentship is not restricted by nationality. Interested applicants are
advised to consult previous work on pervasive eye tracking [1,2]. Applicants
are also strongly encouraged to initiate contact with Dr. Andreas Bulling
prior to their application.

Applicants should submit their CV, copies of school and university degree
certificates and course transcripts with grades (Abitur, Vor- and
Hauptdiplom for German applicants), names and contact information of two
references, a description of research interests and a short research
proposal. Incomplete applications will
not be considered. Applications should be emailed to [log in to unmask].
Review of applications will start on 1st March 2013. Applications are
accepted until the position is filled.

[1] L. Swirski, A. Bulling, N. Dodgson (2012) Robust, real-time pupil tracking
in highly off-axis images, Proc. ETRA 2012: pages 173-176

[2] Y. Zhang, A. Bulling, H. Gellersen (2012) Towards pervasive gaze tracking
with low-level image features, Proc. ETRA 2012: pages 261-264

Contact information:

Dr. Andreas Bulling
Perceptual User Interfaces Group
Max Planck Institute for Informatics
Email: [log in to unmask]
