MM-INTEREST Archives
ACM SIGMM Interest List <MM-INTEREST@LISTSERV.ACM.ORG>

Sender: ACM SIGMM Interest List <[log in to unmask]>
From: Hugo Oliveira Sousa <[log in to unmask]>
Date: Mon, 21 Mar 2022 10:09:09 +0000
Reply-To: Hugo Oliveira Sousa <[log in to unmask]>

++ CALL FOR PARTICIPATION ++
The Text2Story@ECIR'22 workshop (5th International Workshop on Narrative Extraction from Texts) is fast approaching. Text2Story'22 will be a hybrid event, held in Stavanger, Norway, and online (GMT+1). We invite researchers interested in this topic to join us, either in person or online, on the 10th of April. Registration is open here: https://ecir2022.org/registration/
The workshop program features two keynote talks, by Antoine Doucet from the University of La Rochelle, France, and Andreas Spitz from the University of Konstanz, Germany, along with the presentation of 12 research papers. More details are available on the workshop website: http://text2story22.inesctec.pt

++ Invited Speakers ++
- Antoine Doucet [University of La Rochelle] who will give a talk entitled "Robust and multilingual analysis of historical documents"
- Andreas Spitz [University of Konstanz] who will give a talk entitled "We Have the Best Words: From the Web-scale Extraction and Attribution of Quotes to Analysing Negativity in U.S. Political Language"

++ List of Papers ++
- Time for some German? Pre-Training a Transformer-based Temporal Tagger for German [Satya Almasian, Dennis Aumiller and Michael Gertz]
- Understanding COVID-19 News Coverage using Medical NLP [Ali Emre, Veysel Kocaman, Hasham Ul Haq and David Talby]
- Changing the Narrative Perspective: From Ranking to Prompt-Based Generation of Entity Mentions [Mike Chen and Razvan Bunescu]
- EnDSUM: Entropy and Diversity based Disaster Tweet Summarization [Piyush Kumar Garg, Roshni Chakraborty and Sourav Kumar Dandapat]
- Simplifying News Clustering Through Projection From a Shared Multilingual Space [João Santos, Afonso Mendes and Sebastiao Miranda]
- Exploring Data Augmentation for Classification of Climate Change Denial: Preliminary Study [Jakub Piskorski, Nikolaidis Nikolaos, Nicolas Stefanovitch, Jens Linge, Bonka Kotseva and Irene Vianini]
- Dynamic change detection in topics based on rolling LDAs [Jonas Rieger, Kai-Robin Lange, Jonathan Flossdorf and Carsten Jentsch]
- Text2Icons: representing narratives with icon strips [Joana Valente, Alípio Jorge and Sérgio Nunes]
- Comprehensive contextual visualization of a news archive [Ishrat Sami, Tony Russell-Rose and Larisa Soldatova]
- Causality Mining in Fiction [Margaret Meehan, Andrew Piper and Dane Malenfant]
- Extracting Impact Model Narratives from Social Services' Text [Bart Gajderowicz and Mark Fox]
- MARCUS: An Event-Centric NLP Pipeline that generates Character Arcs from Narratives [Sriharsh Bhyravajjula, Ujwal Narayan and Manish Shrivastava]

We hope to see you in Stavanger (or online) on the 10th of April.
Ricardo Campos, Alípio Jorge, Adam Jatowt, Sumit Bhatia and Marina Litvak [Text2Story 2022 Workshop Chairs]
