ACM SIGMM Interest List


dugelay - jean-luc dugelay - <[log in to unmask]>
Tue, 7 Feb 2023 13:09:17 +0100

Date & Time: February 9, 2023 at 12:30 p.m. CET
[6:30 a.m. New York] [12:30 p.m. Paris] [1:30 p.m. Tampere] [7:30 p.m. Beijing]
Title: The Super Neuron Model – A new generation of ANN-based Machine Learning and Applications
Speaker: Moncef Gabbouj, Tampere University, Tampere, Finland.

To join the webinar, please register to receive more details on how to 
connect. The registration form can be found at:
or via the website of the journal at:
Contact: Esinu Abadjivor <[log in to unmask]>

Abstract: Operational Neural Networks (ONNs) are a new generation of network models that address two major drawbacks of conventional Convolutional Neural Networks (CNNs): the homogeneous network configuration and the “linear” neuron model, which can only perform linear transformations over the previous layer’s outputs. ONNs can perform any linear or non-linear transformation with a proper combination of “nodal” and “pool” operators. This is a great leap towards expanding the neuron’s learning capacity in CNNs, but it has thus far required the use of a single nodal operator for all synaptic connections of each neuron. This restriction has recently been lifted by introducing a superior neuron, the “generative neuron”, whose nodal operators can each be customized during training to maximize learning. As a result, the network is able to self-organize the nodal operators of its neurons’ connections. Self-Organized ONNs (Self-ONNs) equipped with superior generative neurons can achieve diversity even with a compact configuration. We shall explore several signal processing applications of neural network models equipped with the superior neuron.
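Published descriptions of Self-ONNs model each generative neuron’s nodal operator as a truncated Maclaurin (power) series with learnable coefficients. The following minimal Python sketch illustrates that formulation (it is an illustration based on the published idea, not code from the talk; all names and the summation pool are assumptions):

```python
def generative_neuron(x, weights, bias=0.0):
    """One Self-ONN 'generative' neuron (illustrative sketch only).

    Each input connection i applies a learnable nodal operator,
    modeled here as a truncated Maclaurin series of order Q:
        psi_i(x_i) = sum over q of weights[i][q] * x_i ** (q + 1)
    The pool operator is taken to be plain summation (as in a CNN),
    followed by a bias term. With Q = 1 the neuron collapses to an
    ordinary linear neuron; larger Q lets training shape a different
    non-linear nodal operator for every synaptic connection.
    """
    total = 0.0
    for xi, wi in zip(x, weights):
        total += sum(w * xi ** (q + 1) for q, w in enumerate(wi))
    return total + bias

# Q = 1 reduces to a standard linear neuron: 0.5*2 + (-1.0)*3 = -2.0
print(generative_neuron([2.0, 3.0], [[0.5], [-1.0]]))

# Q = 2 adds a quadratic term on the single connection: 2 + 4 = 6.0
print(generative_neuron([2.0], [[1.0, 1.0]]))
```

In a full Self-ONN layer, the per-connection coefficients `weights[i][q]` are learned by backpropagation, which is what lets the network self-organize its nodal operators as described above.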

Speaker Bio: MONCEF GABBOUJ received his BS degree in 1985 from Oklahoma 
State University, and his MS and PhD degrees from Purdue University, in 
1986 and 1989, respectively, all in electrical engineering. Dr. Gabbouj 
is a Professor of Information Technology at the Department of Computing 
Sciences, Tampere University, Tampere, Finland. He was Academy of 
Finland Professor during 2011-2015. His research interests include Big 
Data analytics, multimedia content-based analysis, indexing and 
retrieval, artificial intelligence, machine learning, pattern 
recognition, nonlinear signal and image processing and analysis, voice 
conversion, and video processing and coding. Dr. Gabbouj is a Fellow of 
the IEEE and member of the Academia Europaea and the Finnish Academy of 
Science and Letters. He is the past Chairman of the IEEE CAS TC on DSP 
and committee member of the IEEE Fourier Award for Signal Processing. He 
served as an associate editor and guest editor of many IEEE and other international journals, and as a Distinguished Lecturer for the IEEE CASS. Dr.
Gabbouj served as General Co-Chair of IEEE ISCAS 2019, ICIP 2020, ICIP 
2024 and ICME 2021. Gabbouj is Finland Site Director of the USA NSF 
IUCRC funded Center for Visual and Decision Informatics (CVDI) and led 
the Artificial Intelligence Research Task Force of Finland’s Ministry of 
Economic Affairs and Employment funded Research Alliance on Autonomous 
Systems (RAAS).

Webinar videos are available online at


Jean-Luc DUGELAY – Professor
Image Engineering & Security
IEEE and IAPR Fellow


PMI 677 801 4913
+33 (0)4 93 00 81 41




