CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From: "M. Larson" <[log in to unmask]>
Reply To: M. Larson
Date: Sun, 3 May 2015 13:15:54 -0700
Content-Type: text/plain
**************************************
Call for Papers
CrowdRec 2015
http://crowdrecworkshop.org
ACM RecSys 2015 Workshop on Crowdsourcing and Human Computation for 
Recommender Systems
Submission deadline: June 29, 2015
**************************************

CrowdRec 2015 will be held at the 9th ACM Conference on Recommender Systems
Vienna, Austria, 16-20 September 2015
http://recsys.acm.org/recsys15

Human computation is the application of human intelligence to solve 
problems that computers cannot yet solve. Crowdsourcing scales up the 
power of human intelligence by calling on a large number of human 
contributors, referred to as the Crowd.

Recently, many areas of research have awakened to the potential of 
techniques that gather input from human contributors. However, the 
opportunities are particularly promising for recommender systems, whose 
reliance on expressions of human preference, e.g., ratings, in huge 
quantities already qualifies them as crowd-driven technology. In 
focusing so heavily on preference data, however, today's recommender 
systems stop short of actively integrating the full range of human 
intelligence.

The purpose of the CrowdRec workshop is to provide a forum for exchange 
and discussion on how human intelligence and crowd techniques can be 
used to improve recommender systems.

A wide range of possibilities exists for effectively collecting 
intelligent input from humans and for incentivizing the Crowd to make 
specific contributions. Collection of input can occur in social 
communities, via large online crowdsourcing platforms such as Mechanical 
Turk, or by way of a variety of applications that use principles of 
gamification to engage users. Crowd members can directly contribute 
information (such as comments and reviews), can validate information 
(such as tags or descriptions), or can provide feedback on recommender 
system design or performance. At present, however, the Crowd remains 
notoriously difficult to exploit effectively. The challenge arises from 
the complexity of user and crowd member communities: such groups 
constitute dynamic systems that are highly sensitive to changes in the 
form and parameterization of their activities. A thorough understanding 
of how best to present tasks to the Crowd, and how to make use of the 
resulting intelligent input, will be crucial if recommender systems are 
to benefit from crowdsourcing and human computation.
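As a concrete illustration of the validation step mentioned above, one 
common quality-control technique is to collect several redundant 
judgments per item and aggregate them by majority vote. The sketch 
below is a minimal, hypothetical example (the item names and labels are 
invented for illustration), not a method prescribed by the workshop:

```python
from collections import Counter

def aggregate_labels(crowd_labels):
    """Majority-vote aggregation of redundant crowd judgments.

    crowd_labels: dict mapping an item id to the list of labels
    collected from different crowd members for that item.
    Returns a dict mapping each item id to a tuple of
    (consensus label, fraction of members who agreed with it).
    """
    consensus = {}
    for item, labels in crowd_labels.items():
        counts = Counter(labels)
        label, votes = counts.most_common(1)[0]  # most frequent label
        consensus[item] = (label, votes / len(labels))
    return consensus

# Hypothetical task: three crowd members judge whether a tag fits a movie.
judgments = {
    "movie_42_tag_comedy": ["yes", "yes", "no"],
    "movie_42_tag_horror": ["no", "no", "no"],
}
print(aggregate_labels(judgments))
```

The agreement fraction doubles as a simple confidence score: items 
with low agreement can be routed back to the Crowd for additional 
judgments rather than accepted outright.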

The CrowdRec workshop encourages contributions focusing on new 
approaches, concepts, methodologies, and applications that combine 
human computation and crowdsourcing with conventional recommender 
systems. Topics include, but are not limited to, the following:

Human Contributions beyond the User-Item Matrix
- Applications and interfaces for collecting annotations,
- Games With A Purpose (GWAP) or other annotation-as-by-product designs,
- Effective learning from crowd-annotated or crowd-augmented datasets,
- Mining social media to support recommendation,
- Conversational recommender systems,
- Wisdom of the Crowd for decision support.

Designing and Evaluating Recommenders using Crowd Techniques
- Recommender evaluation metrics and studies,
- Crowd-based user studies,
- Human intelligence for personalization support,
- User modeling and profiling.

Methodologies for Human Intelligence in Recommender Systems
- Identifying expertise and managing reputation,
- Engaging crowdmembers and ensuring quality,
- Tools and platforms to support crowd-enhanced recommender systems,
- Inherent biases, limitations and trade-offs of crowd-powered approaches,
- Empirical and case studies of crowd-enhanced recommendation,
- Ethical, cultural and policy issues related to crowd recommendation.

Important Dates:

Submission deadline: June 29, 2015
Notification: July 20, 2015
Camera-ready: July 27, 2015
Deadline for author registration: August 16, 2015
Workshop date: Saturday, September 19, 2015 (afternoon)

Workshop website:
http://crowdrecworkshop.org

Workshop Organizers:
Martha Larson, Delft University of Technology, Netherlands
Domonkos Tikk, Gravity R&D, Hungary
Roberto Turrin, ContentWise R&D, Italy

For questions, please contact Martha Larson: [log in to unmask]

    ---------------------------------------------------------------
    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
