ACM SIGCHI General Interest Announcements (Mailing List)


Tue, 18 Feb 2020 14:10:23 +0000
Flavio Figueiredo <[log in to unmask]>
Call for Reproducibility Papers

RecSys strongly encourages the submission of algorithmic papers that repeat and analyze prior work. We distinguish between:
   * replicability papers, which repeat prior experiments using the original source code and datasets to show how, why, and when the methods work (or not); and
   * reproducibility papers, which repeat prior experiments using the original source code in new contexts (e.g., different application domains and datasets, different evaluation methodologies and metrics) to further generalize and validate (or not) previous work.

Submissions regarding replicability or reproducibility papers are welcome in all areas related to recommender systems (see the main track Call for Papers for a list of topics).

In both replicability and reproducibility papers, we expect authors to provide all materials required to repeat the reported tests, including code, data, and clear instructions on how to run the experiments. Submissions authored by the same researchers who conducted the original experiments will not be accepted.

Each accepted paper will be included in the conference proceedings and presented in a plenary session as part of the main conference program; it will also be allocated a slot in a poster session to encourage discussion and follow-up between authors and attendees.


Both replicability and reproducibility papers will be evaluated along the following criteria:
   * Novelty
      - What is new about the reproduced experiments?
      - Did the original work lack theoretical support?
      - Were the original experiments unclear on important points, or lacking confirmation for some of the original claims?
      - (Reproducibility papers only) Do the reproduced experiments reach more solid conclusions, with new datasets and metrics and unbiased evaluation setups?
      - (Reproducibility papers only) Are there new experiments that allow for a better understanding of the impact of previous results?
   * Impact
      - How important is the reproduction of the experiments to the community?
      - How obvious are the conclusions reached?
      - Do the reproduced prior works, if validated, advance a topic central to recommender systems (one with broad applicability or focused on an active research area)?
   * Reliability
      - Is the evaluation methodology in line with the research challenges addressed by the reproduced experiment?
      - Are the selected baselines representative of the range of algorithm types and techniques available?
      - Is the hyperparameter tuning strategy properly described?
      - Are the algorithms and baselines properly tuned?
   * Availability
      - Are the code and datasets used to reproduce the experiments available to the reviewers at the time of review?
      - Is the shared material released in a permanent repository for easy access by researchers?
      - Are the reproduced experiments well documented, with all the details other researchers would need to reproduce them in turn?
      - Are there discrepancies between what is described in the paper and what is available in the shared material?
      - Is the shared material complete, with everything needed to exactly replicate the experiments?


All submissions and reviews will be handled electronically. Papers must be submitted to PCS by 23:59, AoE (Anywhere on Earth) on May 4th, 2020. There will be no extensions to the submission deadline.

Formatting. ACM is changing the archive format of its publications to separate content from presentation in the new Digital Library, to enhance accessibility, and to improve the flexibility and resiliency of its publications. Following the new ACM publication workflow, all authors should submit manuscripts for review in a single-column format. Paper length is expected to range from 7 pages (for reproductions of a single algorithm) to 14 pages (for reproductions of multiple algorithms). In any case, the maximum length is 14 pages (excluding references) in the new single-column format. Instructions for Word and LaTeX authors are given below:

   * Microsoft Word: Write your paper using the Submission Template (Review Submission Format). Follow the embedded instructions to apply the paragraph styles to your various text elements. The text remains in single-column format at this stage, and no additional formatting is required.
   * LaTeX: Please use the latest version of the Master Article Template - LaTeX to create your submission. You must use the "manuscript" option with the \documentclass[manuscript]{acmart} command to generate the output in a single-column format which is required for review. Please see the LaTeX documentation and ACM's LaTeX best practices guide for further instructions. To ensure 100% compatibility with The ACM Publishing System (TAPS), please restrict the use of packages to the whitelist of approved LaTeX packages.
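For LaTeX authors, a minimal document skeleton following the instructions above might look like the sketch below. The title, author, and section content are placeholders for illustration only; consult the official acmart documentation for the full set of required metadata commands.

```latex
% Minimal single-column review submission using acmart.
% The "manuscript" option produces the one-column format required
% for review and for TAPS processing.
\documentclass[manuscript]{acmart}

\begin{document}

% Placeholder metadata -- replace with your own.
\title{Reproducibility of [Original Paper] in New Application Domains}
\author{First Author}
\affiliation{%
  \institution{Example University}
  \country{Country}}

\begin{abstract}
  A short abstract stating which experiments are reproduced and why.
\end{abstract}

\maketitle

\section{Introduction}
Body text in single-column format for review.

\end{document}
```

Remember that only packages on the TAPS whitelist may be added to this preamble; non-approved packages can break the HTML output generated during production.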

Authors are strongly encouraged to provide "alt text" (alternative text) for floats (images, tables, etc.) in their content, so that readers with disabilities can be given descriptive information for floats that are important to the work. The descriptive text will be displayed in place of a float if the float cannot be loaded. This also benefits the author, as it broadens the reader base for the author's work. Moreover, alt text provides in-depth float descriptions to search-engine crawlers, helping them properly index these floats.

Should you have any questions or issues going through the instructions above, please contact support at [log in to unmask] for both LaTeX and Microsoft Word inquiries.

Accepted papers will be later submitted to ACM's new production platform where authors will be able to review PDF and HTML output formats before publication.

Anonymity. Papers in the reproducibility track will undergo single-blind review. It is expected that at the time of submission, code and datasets used to reproduce the experiments will be available under reasonably liberal terms and sufficiently well-documented such that reviewers may consult that documentation as they conduct their reviews.

Originality. Submitted papers must not have been previously published in, or accepted to, any peer-reviewed journal or conference/workshop, nor be under review elsewhere (including as another paper submission for RecSys 2020). We do not prevent authors from submitting the same paper to institutional or other preprint repositories before the reviewing process is complete. Please refer to the ACM Publishing License Agreement and Authorship Policy for further details.

Plagiarism. Plagiarized papers will not be accepted to RecSys 2020. Our committees will check all submitted papers for plagiarism with an automated tool to ensure content originality. Authors are therefore advised, in their own interest, to check their manuscripts with a similar tool (e.g., iThenticate, Turnitin, Viper, or PlagScan) before submission. The originality report generated by the tool may also be submitted at the time of paper submission.

Papers violating any of the above guidelines are subject to rejection without review.


RecSys 2020 is a SIGCHI conference and making a submission to a SIGCHI conference is a serious matter. Submissions require time and effort by SIGCHI volunteers to organize and manage the reviewing process, and, if the submission is accepted, the publication and presentation process. Thus, anyone who submits to RecSys 2020 implicitly confirms the following statements:
   1. I confirm that this submission is the work of myself and my co-authors.
   2. I confirm that I or my co-authors hold copyright to the content, and have obtained appropriate permissions for any portions of the content that are copyrighted by others.
   3. I confirm that any research reported in this submission involving human subjects has gone through the appropriate approval process at my institution.
   4. I confirm that if this paper is accepted, I or one of my co-authors will attend the conference. Papers that are not presented at the conference by an author may be removed from the proceedings at the discretion of the program chairs.


Important dates:
   * Abstract submission deadline: April 27th, 2020
   * Paper submission deadline: May 4th, 2020
   * Author notification: July 6th, 2020
   * Camera-ready version deadline: July 27th, 2020

Deadlines refer to 23:59 (11:59pm) in the AoE (Anywhere on Earth) time zone.


Reproducibility track chairs:
   * Paolo Cremonesi, Politecnico di Milano, Italy
   * Alan Said, University of Gothenburg, Sweden
