CHI-ANNOUNCEMENTS (ACM SIGCHI General Interest Announcements Mailing List)
From: Shijia Pan <[log in to unmask]>
Date: Mon, 4 Feb 2019 11:58:17 -0500
[Apologies if you got multiple copies of this email.]
===================================================================

Call for Competitors: Aircraft Localization Competition 2019

OVERVIEW

* Date: April 15, 2019
* Place: Montreal, Canada
* Registration Deadline: February 15, 2019
* Co-located with ACM/IEEE IPSN and CPS-IoT Week 2019
* Website: https://competition.opensky-network.org
* Student Travel Grants will be available soon
* Cash Prizes up to EUR 13,500
* Partners/Sponsors: armasuisse, OpenSky Network, SeRo Systems
* Organisers/Committee:
  - Matthias Schäfer (TU Kaiserslautern/SeRo Systems GmbH, Germany)
  - Martin Strohmeier (University of Oxford, UK/armasuisse, Switzerland)
  - Vincent Lenders (armasuisse, Switzerland)
  - Mauro Leonardi (University of Rome Tor Vergata, Italy)
  - Fabio Ricciato (OpenSky Network, Switzerland)

GOAL OF THE COMPETITION

This competition is about finding the best methods to localize aircraft
based on crowdsourced air traffic control communication data. The data
is collected by the OpenSky Network, a large-scale sensor network which
continuously collects air traffic control data from thousands of
aircraft for research. The goal of the competition is to determine the
positions of all aircraft which do not have position-reporting
capabilities or may report wrong locations. To do so, competitors will
rely on time-of-arrival and signal-strength measurements reported by
many different sensors. Although methods like multilateration have long
been known, this data poses new challenges because most of the low-cost
sensors are neither time-synchronised nor calibrated. Competitors will
therefore have to cope with several kinds of noise, ranging from clock
drift and inaccurate sensor locations to broken timestamps caused by
software bugs.
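
The announcement leaves the choice of method open. For orientation,
here is a minimal sketch of classical time-difference-of-arrival
(TDoA) multilateration in Python, assuming ideally synchronised
sensors and Cartesian sensor coordinates in metres; all function and
variable names are ours, not part of the competition materials:

    import numpy as np
    from scipy.optimize import least_squares

    C = 299_792_458.0  # speed of light in m/s

    def tdoa_residuals(pos, sensors, toas):
        # Difference between predicted and measured arrival-time
        # differences, taking the first sensor as the reference.
        dists = np.linalg.norm(sensors - pos, axis=1)
        predicted = (dists - dists[0]) / C
        measured = toas - toas[0]
        return (predicted - measured)[1:]

    def multilaterate(sensors, toas, initial_guess):
        # Least-squares estimate of a 3D emitter position from
        # time-of-arrival measurements at known sensor positions.
        fit = least_squares(tdoa_residuals, initial_guess,
                            args=(sensors, toas))
        return fit.x

The unsynchronised, uncalibrated receivers described above break
exactly this clean TDoA assumption, which is what makes the
competition hard.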

ELIGIBILITY

We encourage both individuals and teams from academia and industry to
register and participate. We strongly emphasize our openness towards
novel approaches (such as machine learning), but competitors may also
adapt their "traditional" localization models to the peculiarities of
the crowdsourced measurement data. The localization algorithms should
be able to produce decent results from a fresh one-hour data set
(~1 GB of CSV) in under three hours.
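
As a practical note, a ~1 GB CSV fits the time budget comfortably only
if it is read efficiently. Below is a minimal sketch of chunked
loading with pandas, assuming the data arrives as a single flat CSV
file (the actual schema is defined on the competition website):

    import pandas as pd

    def load_measurements(path, keep=None, chunksize=1_000_000):
        # Stream the CSV in chunks so peak memory stays bounded even
        # for a ~1 GB file; 'keep' is an optional per-chunk row filter.
        frames = []
        for chunk in pd.read_csv(path, chunksize=chunksize):
            if keep is not None:
                chunk = chunk[keep(chunk)]
            frames.append(chunk)
        return pd.concat(frames, ignore_index=True)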

CATEGORIES

Competitors can choose among four levels of increasing difficulty:

* Category 1 (easiest): data from GPS-synchronised receivers only,
  plus the barometric altitude of the aircraft. Competitors do not
  have to deal with clock drift and can limit the effect of a bad
  vertical dilution of precision by additionally considering the
  target's altitude (see the sketch after this list).
* Category 2: GPS-synchronised data as well, but no information about
  the target's altitude.
* Category 3: data from both GPS-synchronised and unsynchronised
  sensors, with the barometric altitude included.
* Category 4: data from both GPS-synchronised and unsynchronised
  sensors, without altitude information.
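
For categories 1 and 3, one way to exploit the provided altitude is
to fix the vertical coordinate and solve only the horizontal problem.
A sketch reusing tdoa_residuals and least_squares from above (and
ignoring, for simplicity, that barometric and geometric altitude
differ):

    def multilaterate_with_altitude(sensors, toas, guess_xy, altitude):
        # Fix the vertical coordinate to the reported altitude and
        # solve only for x and y, sidestepping the poor vertical
        # dilution of precision of ground-based sensor geometries.
        def residuals(xy):
            pos = np.array([xy[0], xy[1], altitude])
            return tdoa_residuals(pos, sensors, toas)
        return least_squares(residuals, guess_xy).x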

COMPETITION EXECUTION

Competitors are provided with labelled training data sets which include
all aircraft locations. These labelled data sets can be used by the
competitors to train their models. On the competition day, each team
has to send at least one team member to the conference, where they will
get access to non-labelled evaluation data sets. The teams then have
nine hours to find the locations of all aircraft that are missing
location information in the data sets. Every three hours, the teams
have to submit their intermediate results (as a CSV file) to the
organizers. The organizers will then calculate an indicator of the
accuracy of each solution and provide an intermediate ranking. After
nine hours, the teams submit their final results and the final ranking
is determined.
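
The announcement does not fix the submission schema beyond "a CSV
file". As a purely hypothetical illustration, with placeholder column
names that may differ from the official format:

    import csv

    def write_submission(path, estimates):
        # 'estimates' is an iterable of (aircraft_id, timestamp,
        # latitude, longitude, altitude) tuples; the header uses
        # placeholder names, not the official submission schema.
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["aircraft", "timestamp",
                             "latitude", "longitude", "altitude"])
            writer.writerows(estimates)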

EVALUATION AND PRIZE

On the competition day (April 15), all teams have to submit their
intermediate results to the organizers every three hours. These
intermediate results will then be rated using an objective error metric
and the scores will be published. The teams can then continue to
improve their results, e.g., by further pre-filtering the data or
improving their models ad hoc. The leading teams of each intermediate
evaluation (i.e., after 3h and 6h) will receive increasing cash awards.
The awards reach their maximum at the 9h deadline, when all teams have
to submit their final results. The best three teams in each category
will also receive cash prizes for their final results. The cash prizes
are scaled with the level of competition and the difficulty of the
category: the more competitors and the higher the difficulty in a
category, the higher the prizes. The exact prize amounts can be found
on our website:
https://competition.opensky-network.org/competition.html#awards
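
The announcement does not define the error metric itself; one
plausible reading is a mean horizontal error in metres, sketched here
with the haversine great-circle distance (the aggregation is our
assumption, not the official metric):

    import numpy as np

    EARTH_RADIUS_M = 6_371_000.0

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in metres between two points given
        # as latitude/longitude in degrees.
        p1, p2 = np.radians(lat1), np.radians(lat2)
        a = (np.sin((p2 - p1) / 2) ** 2
             + np.cos(p1) * np.cos(p2)
             * np.sin(np.radians(lon2 - lon1) / 2) ** 2)
        return 2 * EARTH_RADIUS_M * np.arcsin(np.sqrt(a))

    def mean_horizontal_error(pred, truth):
        # pred and truth are (n, 2) arrays of latitude/longitude in
        # degrees; returns the average horizontal miss distance.
        return float(np.mean(haversine_m(pred[:, 0], pred[:, 1],
                                         truth[:, 0], truth[:, 1])))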

The teams with the best final results (after 9h) in each category will
have to present their solutions in a short presentation at the
conference. In addition, all teams who won cash awards (intermediate or
final prizes) will have to publish their code under the GNU GPLv3 on
the OpenSky Network's GitHub account. Teams that do not want to publish
their code are not eligible for awards; closed-source solutions can
still compete, but they will not be eligible for cash prizes.
