CHI-ANNOUNCEMENTS Archives

ACM SIGCHI General Interest Announcements (Mailing List)

CHI-ANNOUNCEMENTS@LISTSERV.ACM.ORG

Subject:
From: "Drachsler, Hendrik" <[log in to unmask]>
Reply-To: Drachsler, Hendrik
Date: Thu, 12 Jan 2017 17:17:50 +0100
***********************************************
CALL FOR JOURNAL PAPERS

Special Issue on MMDL (Multimodal Data for Learning)
Journal of Computer Assisted Learning (JCAL) Impact factor: 1.679
http://onlinelibrary.wiley.com/journal/10.1111/(ISSN)1365-2729

Deadline for submissions: 08 May 2017
************************************************

SCOPE
*********
The transition of our world from analog to digital affects all aspects of society, including the educational sector. In recent years we have gained insights into learning behaviour by investigating existing data sources such as learning management systems, mobile applications, and social media environments with analytic methods.

While these data sources can still provide rich ground for research, a new wave of technological innovation is taking place with the Internet of Things (IoT) and the maker movement. IoT devices provide new applications and affordances for everyday life. Wearables, eye-trackers and other camera systems, and self-programmable microcomputers such as the Raspberry Pi and Arduino create new data sources that can be used to investigate learning. These data sources yield so-called multimodal datasets, as they combine data on physical activities and physiological responses with more traditional learning data. Unlike traditional learning data collections, multimodal datasets require manifold data collection methods to combine the diverse data streams.

The new multimodal data research approaches promise a more holistic picture of learners and of the success factors for learning. But multimodal data is much more diverse and heterogeneous than the data available from traditional learning environments. It is challenging to combine data types as varied as text, assessments, activities, physiological data, and video for research purposes and to gain meaningful results from them.
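As a minimal sketch of this fusion challenge, the following Python snippet aligns hypothetical wearable heart-rate samples with clickstream events from a learning environment by timestamp, using pandas; all column and event names here are illustrative assumptions, not a prescribed method.

    import pandas as pd

    # Hypothetical heart-rate samples from a wearable (illustrative values).
    heart_rate = pd.DataFrame({
        "timestamp": pd.to_datetime([
            "2017-01-12 10:00:00", "2017-01-12 10:00:05", "2017-01-12 10:00:10",
        ]),
        "bpm": [72, 75, 81],
    })

    # Hypothetical clickstream events from a learning environment.
    clicks = pd.DataFrame({
        "timestamp": pd.to_datetime([
            "2017-01-12 10:00:03", "2017-01-12 10:00:09",
        ]),
        "event": ["open_exercise", "submit_answer"],
    })

    # merge_asof pairs each event with the most recent sensor reading,
    # one simple way to fuse streams recorded at different rates.
    fused = pd.merge_asof(
        clicks.sort_values("timestamp"),
        heart_rate.sort_values("timestamp"),
        on="timestamp",
    )
    print(fused)

Real studies would of course face further issues, such as clock synchronisation across devices and missing samples.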

In keeping with the nature of JCAL, we are interested in empirical studies that take advantage of multimodal data sources to enrich or investigate learning and teaching. We therefore explicitly look for research that can show effects of multimodal data on the learning and teaching sciences. Literature studies and results from technology infrastructures that offer new insights are also invited.

TOPICS
**********
Relevant topics include, but are not limited to:

 *   multimodal representation of learning
 *   multimodal learning behaviour modelling
 *   real-time data collection
 *   multimodal data mining technologies
 *   multimodal data interpretation
 *   open data sources
 *   supporting feedback and reflection with multimodal data
 *   multimodal data learning analytics
 *   wearable computing for learning
 *   new educational approaches with/for multimodal learning

REVIEW
**********
Reviewers with appropriate expertise in learning analytics, multimodal data, technology-enhanced learning, and e-learning/computer science will be assigned according to the main subjects covered in each submission.

SUBMISSIONS
******************

 *   Authors are invited to submit original, unpublished research papers. All submitted papers will be peer-reviewed for originality, significance, clarity, and quality.
 *   The journal operates double-blind peer review, so authors must provide their title page as a separate file from the main document. The title page includes the complete title of the paper and the affiliation and contact details of the corresponding author (both postal address and email address).
 *   Manuscripts must include a Practitioner Notes section outlining, in bullet-point form, what is currently known about the subject matter, what the paper adds to this, and the implications of the study findings for practitioners. Each section should contain no more than four bullet points of approximately 80 characters each, to maintain clarity.
 *   Manuscripts should be submitted electronically via the online submission site http://mc.manuscriptcentral.com/jcal.
 *   The maximum length is 8,000 words, with an abstract of no more than 200 words.
 *   A template can be found here: http://mc.manuscriptcentral.com/societyimages/jcal/APA6%20template.dotx
 *   Further information about submission and layout can be found here: http://onlinelibrary.wiley.com/journal/10.1111/(ISSN)1365-2729/homepage/ForAuthors.html

IMPORTANT DATES
************************

 *   Submission of manuscripts: 08 May 2017
 *   Completion of first review: 30 June 2017
 *   Submission of revised manuscripts: 30 July 2017
 *   Final decision notification: 01 September 2017
 *   Copy-editing version: 30 September 2017
 *   Publication date: 10 November 2017

Any questions should be sent to: [log in to unmask]


________________________________

This e-mail is intended exclusively for the addressee(s), and may not be passed on to, or made available for use by any person other than the addressee(s). Open Universiteit rules out any and every liability resulting from any electronic transmission. No rights may be derived from the contents of this message.

