Extractive question answering task: given a question and an associated paragraph, systems must locate the shortest span of text that contains the answer. There are no unanswerable questions. The dataset contains texts extracted from the Spanish Wikipedia (encyclopedic articles) and from Wikinews (newswire and news articles).
Publication
Asier Gutiérrez Fandiño, Jordi Armengol-Estapé, Marc Pàmies, Joan Llop-Palao, Joaquín Silveira-Ocampo, Casimiro Pio Carrino, Carme Armentano-Oller, Carlos Rodriguez-Penagos, Aitor Gonzalez-Agirre, Marta Villegas (2022). Procesamiento del Lenguaje Natural, Revista nº 68, March 2022, pp. 39-60.
Language
Spanish
NLP topic
Abstract task
Dataset
Year
2022
Publication link
Ranking metric
F1
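The ranking metric is span-level F1. A common way to compute it for extractive QA (the SQuAD-style token-overlap F1) is sketched below; whether this leaderboard applies additional answer normalization (lowercasing, punctuation stripping) is an assumption.

```python
# Illustrative sketch of token-overlap F1 between a predicted answer span
# and a gold answer, in the style of SQuAD evaluation. The normalization
# here (lowercasing + whitespace tokenization) is a simplification.
from collections import Counter

def token_f1(prediction, gold):
    pred_toks = prediction.lower().split()
    gold_toks = gold.lower().split()
    common = Counter(pred_toks) & Counter(gold_toks)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_toks)
    recall = overlap / len(gold_toks)
    return 2 * precision * recall / (precision + recall)

print(round(token_f1("en Madrid", "Madrid"), 2))  # → 0.67
```

A prediction that overlaps the gold answer only partially still earns partial credit, which is why F1 is preferred over exact match for ranking span-extraction systems.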
Task results
System | Precision | Recall | F1 | CEM | Accuracy
---|---|---|---|---|---
Xlm roberta large | 0.7895 | 0.7895 | 0.7895 | 0.7895 | 0.79
PlanTL GOB ES roberta large bne | 0.7818 | 0.7818 | 0.7818 | 0.7818 | 0.78
PlanTL GOB ES roberta base bne | 0.7584 | 0.7584 | 0.7584 | 0.7584 | 0.76
Ixa ehu ixambert base cased | 0.7429 | 0.7429 | 0.7429 | 0.7429 | 0.74
Bertin roberta base spanish | 0.7298 | 0.7298 | 0.7298 | 0.7298 | 0.73
Dccuchile bert base spanish wwm cased | 0.7276 | 0.7276 | 0.7276 | 0.7276 | 0.73
Xlm roberta base | 0.6988 | 0.6988 | 0.6988 | 0.6988 | 0.70
Bert base multilingual cased | 0.6976 | 0.6976 | 0.6976 | 0.6976 | 0.70
Distilbert base multilingual cased | 0.5566 | 0.5566 | 0.5566 | 0.5566 | 0.56
distilbert-base-multilingual-cased | | | 0.5500 | | 