ACM SIGCHI General Interest Announcements (Mailing List)


Alan Said <[log in to unmask]>
Reply To:
Alan Said <[log in to unmask]>
Fri, 29 Nov 2013 16:31:49 +0100
Call for Papers

ACM Transactions on Intelligent Systems and Technology

Special Issue on Recommender System Benchmarking

Recommender systems add value to vast content resources by matching users with items of interest. In recent years, immense progress has been made in recommendation techniques. The evaluation of these systems, however, is still based on traditional information retrieval and statistical metrics, e.g., precision, recall, and RMSE, which often do not take the use case and context of the system into consideration.

However, the rapid evolution of recommender systems, in both their goals and their application domains, fosters the need for new evaluation methodologies and environments.

This special issue serves as a venue for work on novel, recommendation-centric benchmarking approaches that take users' utility, business value, and technical constraints into consideration.

New evaluation approaches should evaluate both functional and non-functional requirements. Functional requirements go beyond traditional relevance metrics and focus on user-centered utility metrics, such as novelty, diversity and serendipity.

Non-functional requirements focus on performance (e.g., scalability of both model building and on-line recommendation phases) and reliability (e.g., consistency of recommendations with time, robustness to incomplete, erroneous or malicious input data).
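As a minimal illustration of the contrast between traditional accuracy metrics and user-centered utility metrics, the sketch below (with hypothetical toy data and helper names) computes RMSE alongside catalog coverage, one simple proxy for the novelty/diversity family of metrics:

```python
import math

# Hypothetical toy data: (user, item) -> true and predicted ratings.
truth = {("u1", "i1"): 5.0, ("u1", "i2"): 3.0, ("u2", "i1"): 4.0}
preds = {("u1", "i1"): 4.5, ("u1", "i2"): 2.0, ("u2", "i1"): 4.0}

def rmse(truth, preds):
    """Root-mean-square error over the rated (user, item) pairs."""
    errs = [(truth[k] - preds[k]) ** 2 for k in truth]
    return math.sqrt(sum(errs) / len(errs))

def catalog_coverage(recommendations, catalog):
    """Fraction of the catalog that appears in at least one top-N list --
    a simple proxy for how widely a recommender explores the item space."""
    recommended = {item for items in recommendations.values() for item in items}
    return len(recommended & set(catalog)) / len(catalog)

# Hypothetical top-N recommendation lists per user.
recs = {"u1": ["i1", "i3"], "u2": ["i1"]}

print(round(rmse(truth, preds), 4))                        # accuracy view
print(catalog_coverage(recs, ["i1", "i2", "i3", "i4"]))    # utility view
```

Two systems with identical RMSE can differ sharply in coverage, which is one reason the accuracy-only view is insufficient.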

Topics of Interest
We invite the submission of high-quality manuscripts reporting relevant research in the area of benchmarking and evaluation of recommender systems. The special issue welcomes submissions presenting technical, experimental, methodological, and/or applied contributions in this scope, addressing, though not limited to, the following topics:

	• New metrics and methods for the quality estimation of recommender systems
	• Mapping metrics to business goals and values
	• Novel frameworks for the user-centric evaluation of recommender systems
	• Validation of off-line methods with online studies
	• Comparison of evaluation metrics and methods
	• Comparison of recommender algorithms across multiple systems and domains
	• Measuring technical constraints vs. accuracy
	• Robustness of recommender systems to missing, erroneous or malicious data
	• Evaluation methods in new application scenarios (cross domain, live/stream recommendation)
	• New datasets for the evaluation of recommender systems
	• Benchmarking frameworks
	• Multiple-objective benchmarking
	• Real benchmarking experiences (from benchmarking event organizers)

Manuscripts shall be submitted through the ACM TIST electronic submission system (please select "Special Issue: Recommender System Benchmarking" as the manuscript type). Submissions shall adhere to the ACM TIST instructions and guidelines for authors, available on the journal website.

Papers will be evaluated for their originality, significance of contribution, soundness, clarity, and overall quality. Contributions will be assessed in terms of their technical and scientific findings, contribution to the knowledge and understanding of the problem, methodological advancements, and/or applied value.

Important Dates

Paper submission due:      January 20th, 2014
First round of reviews:    March 15th, 2014
First round of revisions:  April 15th, 2014
Second round of reviews:   May 15th, 2014
Final round of revisions:  June 15th, 2014
Final paper notification:  July 15th, 2014
Camera-ready due:          August 2014

Guest Editors
Paolo Cremonesi - Politecnico di Milano
	• [log in to unmask]

Alan Said - CWI
	• [log in to unmask]

Domonkos Tikk - Gravity R&D
	• [log in to unmask]

Michelle X. Zhou - IBM Research
	• [log in to unmask]

Dr. Alan Said
Information Access research group
CWI - Centrum Wiskunde & Informatica
Room M345, Science Park 123
P.O. Box 94079, 1090 GB Amsterdam
e: [log in to unmask]
t: @alansaid
