The task aims to find the best techniques for identifying propagandistic tweets from governmental and diplomatic sources, using a dataset of 9,501 Spanish-language tweets posted by authorities of China, Russia, the United States and the European Union. It consists in determining whether or not a tweet uses propaganda techniques.
Publication
Pablo Moral, Guillermo Marco, Julio Gonzalo, Jorge Carrillo-de-Albornoz, Iván Gonzalo-Verdugo (2023). Overview of DIPROMATS 2023: automatic detection and characterization of propaganda techniques in messages from diplomats and authorities of world powers. Procesamiento del Lenguaje Natural, no. 71, September 2023, pp. 397-407.
Language
Spanish
Year
2023
Ranking metric
F1
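Since precision, recall, and F1 coincide for every system in the results table below, the ranking metric appears to be computed with micro-averaging over both classes. The following sketch shows how such scores could be reproduced with scikit-learn; the labels are invented purely for illustration and are not DIPROMATS annotations.

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Invented gold and predicted labels (1 = propagandistic, 0 = not propagandistic);
# NOT real DIPROMATS data, just an illustration of the metric computation.
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# With micro-averaging over both classes, precision = recall = F1 = accuracy,
# which would explain the identical per-system values in the table below.
p, r, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="micro")
acc = accuracy_score(y_true, y_pred)
print(f"Precision={p:.4f}  Recall={r:.4f}  F1={f1:.4f}  Accuracy={acc:.4f}")
```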
Task results
System | Precision | Recall | F1 | CEM | Accuracy
---|---|---|---|---|---
XLM-RoBERTa-large | 0.8224 | 0.8224 | 0.8224 | 0.8224 | 0.82
XLM-RoBERTa-large-2 | 0.8224 | 0.8224 | 0.8224 | 0.8224 | 0.82
XLM-RoBERTa-large-v3 | 0.8224 | 0.8224 | 0.8224 | 0.8224 | 0.82
Hermes-3-Llama-3.1-8B_2 | 0.8211 | 0.8211 | 0.8211 | 0.8211 | 0.82
Xlm roberta large | 0.8186 | 0.8186 | 0.8186 | 0.8186 | 0.82
PlanTL GOB ES roberta large bne | 0.8177 | 0.8177 | 0.8177 | 0.8177 | 0.82
Hermes-3-Llama-3.1-8B | 0.8168 | 0.8168 | 0.8168 | 0.8168 | 0.82
PlanTL GOB ES roberta base bne | 0.8149 | 0.8149 | 0.8149 | 0.8149 | 0.81
Gemma-2B-IT | 0.8109 | 0.8109 | 0.8109 | 0.8109 | 0.81
Dccuchile bert base spanish wwm cased | 0.7916 | 0.7916 | 0.7916 | 0.7916 | 0.79
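The leading systems are fine-tuned transformer encoders such as XLM-RoBERTa-large and the PlanTL RoBERTa models. As a rough illustration of that approach, the sketch below fine-tunes xlm-roberta-large as a binary propaganda classifier with the Hugging Face transformers library; the file names, column names, and hyperparameters are assumptions and do not reproduce any listed system's exact configuration.

```python
# Minimal sketch (assumed data layout): fine-tune xlm-roberta-large to decide whether
# a tweet contains propaganda techniques. Expects CSV files with "text" and "label"
# columns (0 = no propaganda, 1 = propaganda); not the official DIPROMATS format.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

model_name = "xlm-roberta-large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical train/dev splits of the Spanish tweets.
data = load_dataset("csv", data_files={"train": "train.csv", "validation": "dev.csv"})

def tokenize(batch):
    # Tweets are short, so a modest maximum length keeps training cheap.
    return tokenizer(batch["text"], truncation=True, max_length=128)

data = data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="dipromats2023-task1-es",
        learning_rate=1e-5,
        per_device_train_batch_size=16,
        num_train_epochs=3,
    ),
    train_dataset=data["train"],
    eval_dataset=data["validation"],
    data_collator=DataCollatorWithPadding(tokenizer),
)
trainer.train()
print(trainer.evaluate())
```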