CHI-RESOURCES Archives

ACM SIGCHI Resources (Mailing List)

CHI-RESOURCES@LISTSERV.ACM.ORG

Subject:
From:
Roberto Muñoz <[log in to unmask]>
Reply To:
Roberto Muñoz <[log in to unmask]>
Date:
Fri, 17 Jan 2020 06:35:43 -0300
Content-Type:
text/plain
Parts/Attachments:
text/plain (90 lines)
Apologies for cross-posting.

*Call for Papers*


*Special Issue "New Trends on Multimodal Learning Analytics: Using Sensors
to Understand and Improve Learning"*

Deadline for manuscript submissions: 1 April 2020.

*Journal: Sensors - Impact Factor 3.031 (WoS - Q1)*

Authors should upload their contributions using the submission site:
https://www.mdpi.com/journal/sensors/special_issues/multimodal_learning_analytics_sensor

Dear Colleagues,

Educational environments are being transformed by digital technologies. The
traditional lecture is gradually being abandoned, and learners are changing
from observers into the protagonists of their own learning. As a result,
situations in which learners produce unique solutions, interact in groups,
or must present their ideas to their peers are challenging to assess and to
provide appropriate feedback for [1]. Under that premise, incorporating
sensors that capture information about the transformations occurring inside
educational settings is essential for the continuous enhancement of
educational processes.

Multimodal learning analytics (MMLA) is a subfield of learning analytics
that deals with data collected and integrated from different sources,
allowing a more panoramic understanding of the learning processes and the
different dimensions related to learning [2]. MMLA allows the observation
of interactions and nuances that are normally overlooked by traditional
learning analytics methods, which frequently rely exclusively on
computer-based data [3]. In this direction, introducing low-cost sensors
gives access to information about learners’ interactions with each other
and with their surroundings in physical space, which would not be possible
with traditional log data alone. A wide range of sensors has been used in
MMLA experiments, ranging from those collecting students’ motoric (body,
head) and physiological (heart, brain, skin) behavior to those capturing
the social (proximity), situational, and environmental (location, noise)
contexts in which learners are placed [4].

This Special Issue focuses on all kinds of sensors used for collecting data
and conducting MMLA studies, as well as on the impacts on learning achieved
through the use of those sensors.

The topics of interest include but are not limited to:

   - Wearable trackers;
   - Multimodal classroom analytics;
   - Real-time multimodal data collection;
   - Feedback from multimodal data provided by (and through) sensors;
   - Data collection, analysis methods, and frameworks for MMLA;
   - All kinds of learning experimentations based on multimodal data
   (collaboration, mobility/location, body postures, gestures, etc.) in
   different contexts (oral presentation, problem-solving, lectures, etc.);
   - Multimodal data representation and visualization;
   - Challenges and limitations on processing and synchronizing data from
   multiple sources.

Other inquiries should be sent directly to any of the Guest Editors:

Prof. Dr. Roberto Muñoz ([log in to unmask])
Prof. Dr. Cristian Cechinel ([log in to unmask])
Dr. Mutlu Cukurova ([log in to unmask])
*Guest Editors*

--



*Dr. Roberto Muñoz Soto*
Director
Escuela de Ingeniería Civil Informática
Facultad de Ingeniería
Universidad de Valparaíso
http://informatica.uv.cl | Scholar <https://goo.gl/ZKOWmA> | ResearchGate
<https://www.researchgate.net/profile/Roberto_Munoz28/research>
Sec: +56 32 260 3630   Of: +56 32 260 3638

    ---------------------------------------------------------------
                To unsubscribe, send an empty email to
       mailto:[log in to unmask]
    For further details of CHI lists see http://listserv.acm.org
    ---------------------------------------------------------------
