MM-INTEREST Archives

ACM SIGMM Interest List

MM-INTEREST@LISTSERV.ACM.ORG

From: Feng XIA <[log in to unmask]>
Date: Fri, 3 Mar 2023 22:02:39 +1100
[Please accept our apologies if you received multiple copies of this call]

Submission Deadline: 1 July 2023 (extended & hard deadline)
Early submissions are encouraged.

CALL FOR PAPERS

IEEE Transactions on Neural Networks and Learning Systems (IEEE TNNLS)
Special Issue on Graph Learning

https://www.xia.ai/tnnls-si-gl
https://cis.ieee.org/publications/t-neural-networks-and-learning-systems/ieee-transactions-on-neural-networks-and-learning-systems

[Editor-in-Chief]
Yongduan Song, Chongqing University, China

[Guest Editors]
- Feng Xia, RMIT University, Australia
- Renaud Lambiotte, University of Oxford, United Kingdom
- Neil Shah, Snap Research, USA
- Hanghang Tong, University of Illinois Urbana-Champaign, USA
- Irwin King, The Chinese University of Hong Kong, Hong Kong

[Introduction]
Graphs (or networks) are a powerful data structure. A vast range of
real-world scenarios involve graphs, for instance, social networks, traffic
networks, neural networks, biological networks, communication networks, and
knowledge graphs, to name a few. However, classical machine learning and
deep learning algorithms cannot be applied directly to many graph-based
domains, because graph data lie in an irregular (i.e., non-Euclidean)
domain.

Graph learning (a.k.a. graph machine learning or machine learning on
graphs) has attracted huge research attention over the past few years
thanks to its great potential. For example, graph learning offers the
significant ability to exploit the topological structure of graphs.
Moreover, graph learning can recursively aggregate information from
nodes' neighbours to learn a feature vector for every node. The use of
graph learning methods, such as graph neural networks, network embedding,
and representation learning, has led to unprecedented progress on many
challenges facing real-world applications, such as recommender systems,
anomaly detection, smart surveillance, traffic forecasting, disease control
and prevention, medical diagnosis, and drug discovery. Despite its rapid
emergence and significant advances, the field of graph learning still
faces various challenges arising from, e.g., fundamental theory and
models, algorithms and methods, supporting tools and platforms, and
real-world deployment and engineering.
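The recursive neighbourhood aggregation mentioned above can be illustrated
with a minimal sketch. The toy graph, feature matrix, and mean-aggregation
rule below are hypothetical illustrations (one simple aggregation scheme,
not the method of any particular paper):

```python
import numpy as np

# Hypothetical toy graph: 4 nodes, undirected edges 0-1, 0-2, 1-2, 2-3.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

X = np.eye(4)  # one-hot node features, one row per node

def aggregate(A, X):
    """One round of mean aggregation over each node's neighbourhood
    (including the node itself, via an added self-loop)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    deg = A_hat.sum(axis=1, keepdims=True)  # per-node degree
    return (A_hat @ X) / deg                # mean of neighbour features

# Stacking such rounds (with learned weights and nonlinearities between
# them) is the basic pattern behind graph neural networks.
H = aggregate(A, X)
print(H.round(2))
```

Each row of H is now a mixture of that node's own features and its
neighbours' features; applying `aggregate` again would propagate
information from two-hop neighbours, and so on recursively.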

This special issue will feature the most recent research results in graph
learning. The issue welcomes both theoretical and applied research, and
will encourage data sharing, advocate gold-standard evaluation on shared
data, and promote the exploration of new directions.

[Scope of the Special Issue]
Topics of interest include (but are not limited to):
- Foundations and principles of graph learning
- Novel machine learning models and algorithms over graphs
- Graph neural networks
- Deep learning on graphs
- Graph mining and analytics
- Network representation learning
- Learning on temporal, large-scale, and/or complex graphs
- Responsible and trustworthy graph learning
- Knowledge-informed graph learning
- Robustness and adversarial attacks on graphs
- Geometric machine learning
- Graph theory and network science for machine learning
- Knowledge graphs
- Graph datasets and benchmarks
- Graph learning systems, platforms, and applications in various domains

[Submission Instructions]
- Read the Information for Authors at http://cis.ieee.org/tnnls
- Submit your manuscript through ScholarOne Manuscripts (
http://mc.manuscriptcentral.com/tnnls) and choose “Special Issue: Graph
Learning” as Type in Step 1: Type, Title, & Abstract.
- Early submissions are encouraged. We will start the review process as
soon as we receive a submission.
