CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From: Prasant Misra <[log in to unmask]>
Reply To: Prasant Misra <[log in to unmask]>
Date: Tue, 20 Mar 2018 21:55:33 +0530
Content-Type: text/plain
--------------------------------------------------------------------------------
IoPARTS 2018 : Call for Papers
The 1st International Workshop on Internet of People, Assistive Robots and Things
Held in conjunction with ACM MobiSys 2018
--------------------------------------------------------------------------------

Dear Colleagues,

We invite original submissions for IoPARTS 2018: the 1st ACM International
Workshop on Internet of People, Assistive Robots and Things, held in
conjunction with ACM MobiSys 2018 (https://www.sigmobile.org/mobisys/2018)
on 10 June 2018 in Munich, Germany. Our sincere apologies if you receive
multiple copies of this call.

IoPARTS 2018 invites researchers, engineers, and subject-matter experts from
academia and industry to present their latest innovations addressing
coworking challenges between humans and robots in the context of IoT. The
workshop stems from the observation that rapid advances in sensing, sensor
algorithms, miniaturized computing, and improved actuation are quickly
bringing robots into our daily lives. The near future holds the promise of
driverless cars, package-delivering drones, precision agriculture,
co-working robots, and more. A major paradigm change in these applications
is the degree of autonomy of the robots and the close proximity of their
operation in urban environments. This differs dramatically from prior
deployments of automation in factories, where highly structured environments
were constructed for their operation. The change brings several challenges
arising from close proximity of operation with humans, modality of
interaction, general awareness, and more. This workshop aims to bring out
the computer systems challenges that this future paradigm presents,
including spatial awareness of the robot, the ability to react in a timely
manner, faster and better modes of interaction, and systems/protocols for
seamless interaction in diverse environments.

Topics of interest include, but are not limited to:
- Real-time software design for mixed-criticality systems
- Smart space design for human-robot systems
- System architectures and protocols for interacting with unmanned
  autonomous vehicles
- Localization, navigation, and dynamic path planning
- Novel sensing and interface technologies for human-robot interaction
- Algorithms for human-robot coordination
- Joint inference and learning based on data generated by human wearables,
  fixed sensors, and robots
- Human-in-the-loop algorithms for robot sensing/actuation
- Verification of hybrid human-robot systems
- Improved communication/visualization between robots and humans
- Energy management among robots and mobile/fixed sensors
- Issues in heterogeneous sensor modalities and capabilities among humans
  and robots
- Security/privacy challenges of human-robot systems
- Ethical/social challenges of human-robot systems
- Results from prototypes, test-beds, and demonstrations
- Novel applications of human-robot systems

Two types of submissions are solicited:
- Full papers: Maximum length of 6 pages, including title, author list,
  abstract, all figures, tables, and references. At least one author of each
  accepted paper must register for the workshop and present the paper.
- Vision abstracts: Extended abstracts of at most 2 pages that offer a
  future vision for a research direction in this space. The abstract should
  include title, author list, narrative (the vision statement), and
  references. At least one author of each accepted abstract must register
  for the workshop and participate in a future-visions session. The session
  will include short talks by authors of accepted abstracts, followed by
  discussion.

Submission Guidelines:
- IoPARTS invites submission of original work not previously published or
  under review at another conference or journal.
- Submissions (including title, author list, abstract, all figures, tables,
  and references) must be no longer than 6 pages for full papers and no
  longer than 2 pages for vision abstracts.
- Papers must be in PDF format and should be submitted through the
  submission website: https://ioparts18.hotcrp.com/
- Reviews will be single-blind: authors' names and affiliations should be
  included in the submission.
- All submissions must follow the formatting guidelines given at
  https://www.sigmobile.org/mobisys/2018/submission/; submissions that do
  not meet the size and formatting requirements will not be reviewed.
- Accepted papers will be published by ACM.

Important Deadlines:
- Submission Due: 31 March 2018, 11:59 PM AoE
- Notification of Acceptance: 20 April 2018
- Camera Ready Due: 04 May 2018
- Workshop: 10 June 2018

Please do not hesitate to reach out to [log in to unmask] or
[log in to unmask] with any questions regarding the workshop.

Sincerely,
Prasant Misra, Niki Trigoni, Sen Wang, Karthik Dantu and Hongkai Wen
(Organising Co-Chairs)

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
