MM-INTEREST Archives

ACM SIGMM Interest List

MM-INTEREST@LISTSERV.ACM.ORG

Date: Wed, 17 Aug 2022 10:04:16 +0200
From: Çağrı Erdem <[log in to unmask]>
Sender: ACM SIGMM Interest List <[log in to unmask]>
Dear all,

(apologies for cross-posting)

Please find below, and feel free to distribute, the Call for Participation
for the EmAI – Embodied Perspectives on Musical AI workshop
<https://www.uio.no/ritmo/english/news-and-events/events/workshops/2022/embodied-ai/index.html>.


Call for Participation

We are now welcoming submissions for the workshop Embodied Perspectives 
on Musical AI that will take place in a hybrid format at the University 
of Oslo, Norway, on 21-22 November 2022.

Submission deadline: September 1, 2022, Anywhere on Earth (AoE)


Embodiment, or more concretely musical embodiment, denotes how the body 
shapes our musical experiences. For example, you may exert more or less 
effort depending on the uncertainty of a musical situation or the 
technical difficulty of what you are playing. Such varying levels of 
effort during a live performance can lead to particular affective states 
and bodily arousal. You can also use your body functionally, such as 
swaying your whole body to help keep the groove or nodding your head to 
signal a bandmate to return to the tune's main melody. From an enactive 
perspective, our perception is shaped by our actions. As such, cognition 
emerges not just from information processing but mainly from the dynamic 
interaction between the agent and the environment. All in all, we 
experience music with our bodies, using more modalities than hearing 
alone, regardless of whether we perform or listen.


Most musical artificial intelligence (AI) and multi-agent systems (MAS) 
focus on the music information found in the auditory domain. Modeling 
instrumental acoustics, synthesizing raw audio, or generating symbolic 
music data are highly complex tasks that AI can already accomplish to 
some extent. Still, it is unclear how these technologies can collaborate 
with musicking humans. How will machines perceive humans as diverse 
embodied entities, and how will humans communicate with machines 
exploiting multiple modalities? How can embodiment theories contribute 
to creating intelligent musical agents? At large, how will we make music 
with AI in the future? The interaction and artistic contexts, 
perceptual-motor constraints, affective states, environmental features, 
sensing devices, and computational systems are just a few of the factors 
that can come into play in attempting to answer such questions.


An interdisciplinary research model encompassing natural sciences, 
humanities, cognition, and performing arts is often necessary to 
understand conventional forms of musical collaboration and create novel 
music technologies. Thus, in this two-day workshop, we will convene a 
group of scholars, artists, and engineers from diverse disciplines to 
emphasize embodied perspectives on musical human-AI interactions. 
Together we will explore theories and practices in this intriguing 
domain and discuss musical AI's past, present, and future through the 
lenses of embodied cognition.


Submissions

The two-day workshop will consist of thematic sessions in addition to 
two keynote lectures and one evening performance. We invite submissions 
for the thematic sessions. To encourage dialogue, presenters will give 
short 10-minute presentations followed by a panel discussion. Proposals 
can include demos, completed projects, works-in-progress, and 
provocations. We solicit anything of relevance to the topic of Embodied 
Perspectives on Musical AI. Short proposal texts (200-300 words) can be 
submitted at: https://nettskjema.no/a/emai.


Organizers

RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, 
University of Oslo

Interaction and Robotics Lab (Çağrı Erdem, Sayed Mojtaba Karbasi, 
Riccardo Simionato, Alexander Refsum Jensenius)


Çağrı Erdem [he/him]
Doctoral Research Fellow
RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion
University of Oslo
https://people.uio.no/cagrie   
