CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From: Thomas Pederson <[log in to unmask]>
Reply-To: Thomas Pederson <[log in to unmask]>
Date: Tue, 8 Sep 2020 14:09:20 +0000
Content-Type: text/plain

Application deadline: October 18th 2020.
Start: January 1st 2021 or according to agreement.


Augmented Reality-based Training and Guidance for production worker teams
-------------------------------------------------------------------------
An open, fully funded PhD student position in Production Technology at University West, Sweden.

The doctoral project explores how emerging wearable interaction technologies, in the shape of Augmented Reality (AR) glasses, can help improve ergonomics, quality, and efficiency in existing workflows characterized by collaboration and knowledge sharing a) among individuals working in production worker teams and b) between those individuals and supporting semi-automated digital systems that provide just-in-time guidance in different phases of the production process.

While the project has clear connections to Computer-Supported Cooperative Work (CSCW), its focus is on Artificial Intelligence (AI)-driven Human-Computer Interaction (HCI). The project aims to define a new interaction paradigm for subtle just-in-time communication between, on the one hand, a) an AI system that monitors work progress through wearable and embedded sensors and, when appropriate, augments the environment with more or less subtle cues (e.g. shown on the AR display), and, on the other hand, b) production workers who remain largely free to modify, and to exchange experience and knowledge about, continuously evolving work procedures with each other. The long-term goal of the intended AR-based AI system is to acknowledge, adapt to, and support this kind of work-integrated learning within and across teams through different kinds of assistance mechanisms.

The focus of the project is the identification of subtle, implicit ways for the AR system to provide guidance and information in general. Emerging AR glasses with built-in gaze tracking offer a unique opportunity for subtle gaze direction (Bailey et al., 2009), which is expected to be an important system component for enabling seamless guidance, although other methods will also be explored. This aspect of the project demands consideration of human perception and, in particular, attention mechanisms. Essentially, the project is expected to contribute to a better understanding of how interactive systems could interface with unconscious cognitive processes, a largely unexplored area within HCI.
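
To give a concrete feel for the kind of mechanism referred to above, a rough sketch of a subtle gaze direction loop in the spirit of Bailey et al. (2009) might look like the following. This is purely illustrative and not part of the announced project: GazeTracker/ARDisplay stand in for whatever APIs the chosen AR glasses expose, and all thresholds are assumed placeholder values.

    # Illustrative sketch only: pulse a faint luminance cue in the wearer's
    # peripheral vision and remove it as soon as the gaze starts moving toward
    # it, so the cue never reaches conscious awareness (subtle gaze direction).
    # "tracker" and "display" are hypothetical AR-glasses API objects.

    import math
    import time

    SACCADE_ONSET_DEG = 10.0   # assumed: gaze this close to the target counts as "heading there"
    FRAME_TIME_S = 1.0 / 60    # roughly one display frame

    def angle_deg(a, b):
        """Angle in degrees between two 3D direction vectors."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

    def direct_gaze(tracker, display, target_dir, timeout_s=3.0):
        """Pulse a subtle cue toward target_dir until the wearer looks that way (or timeout)."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            gaze_dir = tracker.current_gaze_direction()      # hypothetical tracker API
            if angle_deg(gaze_dir, target_dir) < SACCADE_ONSET_DEG:
                display.clear_cue()                          # remove cue before fixation lands
                return True
            display.pulse_luminance_cue(target_dir)          # faint peripheral modulation
            time.sleep(FRAME_TIME_S)
        display.clear_cue()
        return False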

Qualifications of merit for this position: To be successful in this project, the applicant should be highly motivated, work in a structured way, and have good communication and collaboration skills. The applicant should have excellent software/hardware prototyping skills, experience with quantitative and qualitative methods for evaluating interactive systems, and a willingness to acquire a deep understanding of human perception and cognition. Previous experience with construction work industry workflows, or with designing for them, is valuable, as is formal university-level education in HCI, interaction design, and/or AI. Good verbal and written communication skills in English are also a merit.

More info: https://hv.varbi.com/se/what:job/jobID:350454/type:job/where:4/apply:1

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
