CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Olana Missura <[log in to unmask]>
Sun, 22 May 2016 07:31:05 -0400
ECML/PKDD 2016 Discovery Challenge: the 3rd challenge is online at
http://alt.qcri.org/ecml2016/

cQA Challenge: Learning to Re-Rank Questions for Community Question Answering

With the widespread use of Web forums such as Yahoo! Answers or Stack
Overflow, there has been renewed interest in Community Question Answering
(cQA). cQA combines traditional question answering with a modern Web
scenario in which users pose questions hoping to get the right answers from
other users. The most critical problem arises when a new question is asked
in the forum: if the question is similar (or even semantically equivalent)
to a previously posted one, the user should not have to wait for answers,
or for another user to point them to the relevant thread already archived
in the forum. An automatic system can search for relevant previously posted
questions and return the found information instantly.

In this challenge, given a new question and a set of questions previously
posted to a forum, together with their corresponding answer threads, a
machine learning model must rank the forum questions according to their
relevance to the new user question.
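To make the task concrete, the interface can be sketched as: score every
candidate forum question against the new question, then sort by score. The
word-overlap scorer below is a hypothetical toy baseline for illustration
only, not one of the challenge's feature-based models.

```python
# Toy sketch of the reranking interface: score each candidate forum
# question against the new question, then sort candidates by score.
# overlap_score is an illustrative baseline, not the challenge's method.

def overlap_score(new_q, forum_q):
    """Jaccard similarity over lowercased word sets."""
    a, b = set(new_q.lower().split()), set(forum_q.lower().split())
    return len(a & b) / len(a | b)

new_q = "how do I reset my password"
candidates = [
    "best pizza places in town",
    "forgot password how to reset it",
    "how to delete my account",
]
# Higher score = more relevant; sort descending.
ranked = sorted(candidates, key=lambda q: overlap_score(new_q, q), reverse=True)
```

Any real submission would replace `overlap_score` with a model trained on
the features the organizers distribute, but the rank-by-score step is the
same.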

Although this task involves both Natural Language Processing (NLP) and
Information Retrieval, the challenge focuses on the machine learning
aspects of reranking the relevant questions. Therefore, we provide
participants with both the initial rank and the feature representations of
the training and test examples. We extract features from the text of the
user and forum questions using advanced NLP techniques, e.g., syntactic
parsing. Most interestingly, we also provide the Gram matrices of tree
kernels applied to advanced structural tree representations. A few other
features express the relevance of the thread comments, associated with the
forum questions, to the user question.
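One simple way to exploit a precomputed Gram matrix is to score each
candidate by its mean kernel similarity to the known relevant training
questions. The sketch below uses a tiny made-up matrix; the numbers,
shapes, and variable names are illustrative assumptions, not the official
data format.

```python
# Sketch: reranking with a precomputed Gram matrix of kernel values.
# K[i][j] stands in for the tree-kernel similarity between candidate
# question i and labeled training question j (toy values, not real data).

K = [
    [0.9, 0.1, 0.2],   # candidate 0
    [0.4, 0.8, 0.7],   # candidate 1
    [0.1, 0.3, 0.2],   # candidate 2
]
labels = [1, 1, 0]      # relevance of the labeled training questions

def score(row, labels):
    """Mean kernel similarity to the relevant training questions."""
    rel = [k for k, y in zip(row, labels) if y]
    return sum(rel) / len(rel)

# Indices of the candidates, best-scoring first.
ranked = sorted(range(len(K)), key=lambda i: score(K[i], labels), reverse=True)
```

A kernel machine (e.g., an SVM trained on the precomputed Gram matrix)
would learn weights over the training questions instead of averaging
uniformly, but the data flow is the same.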

Participants are expected to exploit these data to build novel and
effective machine learning models that rerank the initial question list to
achieve a better Mean Average Precision (MAP).
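For reference, MAP averages, over all queries, the Average Precision of
each ranked list, where AP is the mean of the precision values measured at
each relevant position. A minimal self-contained sketch:

```python
# Minimal sketch of Mean Average Precision (MAP), the challenge's
# evaluation measure. Each query is a ranked list of binary relevance
# labels (1 = relevant forum question, 0 = irrelevant).

def average_precision(relevance):
    """AP for one ranked list of 0/1 relevance judgments."""
    hits, total = 0, 0.0
    for rank, rel in enumerate(relevance, start=1):
        if rel:
            hits += 1
            total += hits / rank   # precision at this relevant position
    return total / hits if hits else 0.0

def mean_average_precision(ranked_lists):
    """MAP over a collection of queries."""
    return sum(average_precision(r) for r in ranked_lists) / len(ranked_lists)
```

For example, a perfect ranking `[1, 1, 0]` has AP 1.0, while `[0, 1, 1]`
has AP (1/2 + 2/3)/2 = 7/12, so the MAP over the two queries is 19/24.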
Challenge Website

http://alt.qcri.org/ecml2016/

Organizers

Discovery Challenge Chairs

   - Elio Masciari, ICAR-CNR, Italy
   - Alessandro Moschitti, Qatar Computing Research Institute, HBKU
     (Prof. at the University of Trento, Italy)

cQA Challenge Chairs

   - Alberto Barrón-Cedeño, Qatar Computing Research Institute
   - Giovanni Da San Martino, Qatar Computing Research Institute
   - Simone Filice, Università degli Studi di Roma "Tor Vergata"
   - Preslav Nakov, Qatar Computing Research Institute

Prizes

Prizes will be awarded to the two best performing teams:

   - €1,000 to the winner on the test set;
   - €500 to the winner on the development set;
   - if the same team wins on both sets, the €500 go to the first
     runner-up on the test set.

Important dates

Release of the training and development sets: Thursday, May 12, 2016

Opening of the online oracle for submissions on the development set: Monday,
May 16, 2016

Registration deadline: Friday, July 22, 2016

End of submission period on the development set: Friday, July 22, 2016

Release of the test set: Saturday, July 23, 2016

End of submission period on the test set: Saturday, July 30, 2016

Winner announcement: Monday, August 1, 2016

Deadline for system description report submission (selected only): Sunday,
August 7, 2016

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
