ACM SIGCHI General Interest Announcements (Mailing List)


From: Nicolas Gonzalez Thomas <[log in to unmask]>
To: "ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Date: Mon, 8 Dec 2014 15:15:38 -0800
Content-Type: text/plain; charset=utf-8

Call for Papers — ACM Computers in Entertainment — Special Issue on Musical Metacreation 

Musical Metacreation (MuMe) is an emerging term for the body of research concerned with automating any or all aspects of musical creativity. It seeks to bring together and build upon existing academic fields such as algorithmic composition, generative music, machine musicianship, and live algorithms. MuMe is understood as a branch of computational creativity: the study of autonomous systems that produce outputs that, had a human produced them, would be deemed creative. It involves a broad community across research, practice, and industry, with a growing body of literature showing that the automation of creative processes is an active reality, not a distant research objective. The consolidation of these various strands of research into an active community is timely: creative music software is flourishing, creative digital music practice is becoming ever more nuanced and innovative, and big data and artificial intelligence are expanding their reach into a growing number of areas of human activity, the creative arts being no exception. These developments are set against a backdrop of philosophical enquiry into how the automation of creativity sheds light on human creativity and the possibility of artificial creativity.

Emerging trends in current research include: an increased use of HCI techniques in studying and improving computationally creative systems; advances in automating evaluation and conceptual representation in computationally creative systems; deep learning; the weaving of big-data services and web technologies into collaborative, open-ended frameworks for computational creativity; and the automated composition of longer temporal structures and of less traditional aspects of musical creativity, such as timbre. The purpose of this special issue is to provide an update on these aspects and more as the field becomes increasingly applied and practiced.

This special issue of ACM Computers in Entertainment, associate-edited by Dr. Shlomo Dubnov (University of California, San Diego) and guest-edited by Dr. Oliver Bown (Design Lab, University of Sydney), Dr. Philippe Pasquier (SIAT, Simon Fraser University), and Dr. Arne Eigenfeldt (SCA, Simon Fraser University), invites contributions of substantial research in any area relevant to musical metacreation.

We welcome papers on any of the following themes: 

Representation and Algorithms for MuMe 


Novel representations of musical information 

Advances or applications of AI, machine learning, and statistical techniques for generative music 

Advances or applications of evolutionary computing or agent and multiagent-based systems for generative music 

Big data, crowdsourcing and distributed approaches in musical metacreation 

Systems and Applications of MuMe 


Systems for autonomous or interactive music composition 

Systems for automatic generation of expressive musical interpretation 

Systems for learning or modelling music style and structure 

Systems for intelligently remixing or recombining musical material 

Online musical systems (i.e. systems with a real-time element) 

Adaptive and generative music in video games 

Techniques and systems for supporting human musical creativity 

Applications of musical metacreation for digital entertainment: sound design, soundtracks, interactive art, etc. 

Evaluation of MuMe 


Methodologies for qualitative or quantitative evaluation of MuMe 

Studies reporting on the evaluation of MuMe 

Theory and Social Impact of MuMe


Computational models of human musical creativity 

Discussion of new genres and communities of practice related to MuMe 

Socio-economic impact, authorship and legal implications of MuMe 

For an idea of MuMe-relevant material, please review the proceedings of the Musical Metacreation Workshop. The workshop has been running since 2012 as part of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE) (search within this page for "Musical Metacreation").

Submit an Expression of Interest

Only authors who submit an expression of interest will be considered for the special issue. 

Expressions of interest should include a paper outline of 500-1000 words, as well as a single biography covering all authors, 200 words in total.

Paper outlines should include a discussion of your previous work in the area, a summary of the paper's novel contribution, and a statement of the work's significance to the MuMe field.

Please email your expression of interest in PDF form to [log in to unmask].

Dates

Expressions of interest due: January 9th, 2015.

Response to expressions: on or before January 26th, 2015.

Final papers due: May 1st, 2015.

Editors' decisions announced: July 3rd, 2015.

Submission of final camera-ready copies: September 4th, 2015.

Publication date TBC. 

Submission instructions for final manuscripts will be provided along with the editors' response to expressions. General submission guidelines can be found here.

Please contact Oliver Bown ([log in to unmask]) if you have any further questions. 


Philippe Pasquier 
Associate Professor, 
School of Interactive Arts and Technology, 
Faculty of Communication, Art and Technology, 
Simon Fraser University, Vancouver, Canada. 
Mobile: +1 778-989-1240 | Phone: +1 778-782-8546 
Email: [log in to unmask]
Skype: pasquierphilippe

    For news of CHI books, courses & software, join CHI-RESOURCES
     mailto: [log in to unmask]

    To unsubscribe from CHI-ANNOUNCEMENTS send an email to
     mailto:[log in to unmask]

    For further details of CHI lists see