The task consists of detecting disability mentions in abstracts of biomedical articles in English. It follows the guidelines established for the IberEval 2018 shared task "Disability annotation on documents from the biomedical domain (DIANN)".
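The corpus itself is not reproduced here; as a rough illustration of the span-annotation setup, the snippet below marks one disability mention with character offsets. The sentence, offsets, and label name are made up for illustration and are not taken from the DIANN data.

```python
# Illustrative only: a disability mention represented as a (start, end, label)
# span with character offsets into the abstract text.
abstract = "Patients with hearing loss were assessed for additional impairments."
gold_mentions = [(14, 26, "dis")]          # span covering "hearing loss"
start, end, label = gold_mentions[0]
print(abstract[start:end], label)          # -> hearing loss dis
```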
Publication: Hermenegildo Fabregat, Juan Martínez-Romo, and Lourdes Araujo. 2018. Overview of the DIANN task: Disability annotation task. In Proceedings of the Third Workshop on Evaluation of Human Language Technologies for Iberian Languages (IberEval 2018), co-located with the 34th Conference of the Spanish Society for Natural Language Processing (SEPLN 2018), Sevilla, Spain, September 18, 2018. CEUR Workshop Proceedings, volume 2150, pages 1–14. CEUR-WS.org.
Language: English
Year: 2023
Ranking metric: F1 (see the computation sketch below)
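As a reading aid for the ranking metric, here is a minimal sketch of span-level precision, recall, and F1 under strict matching of (start, end, label) triples. The matching criterion is an assumption; the official scorer may differ in details (for example, how it treats negated or partially overlapping mentions).

```python
def span_prf1(gold, pred):
    """Precision, recall, and F1 over sets of (start, end, label) spans,
    counting a prediction as correct only on an exact match."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical gold and predicted spans for one abstract.
gold = [(14, 26, "dis"), (56, 67, "dis")]
pred = [(14, 26, "dis")]
print(span_prf1(gold, pred))  # -> (1.0, 0.5, 0.666...)
```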
Task results
System | Precision | Recall | F1 | CEM | Accuracy
---|---|---|---|---|---
roberta-large | 0.7982 | 0.7982 | 0.7982 | 0.7982 | 0.80
xlm-roberta-large | 0.7740 | 0.7740 | 0.7740 | 0.7740 | 0.77
ixa-ehu/ixambert-base-cased | 0.7695 | 0.7450 | 0.7450 | 0.7695 | 0.75
roberta-base | 0.7612 | 0.7612 | 0.7612 | 0.7612 | 0.76
xlm-roberta-base | 0.7438 | 0.7438 | 0.7438 | 0.7438 | 0.74
bert-base-multilingual-cased | 0.7384 | 0.7384 | 0.7384 | 0.7384 | 0.74
bert-base-cased | 0.7364 | 0.7364 | 0.7364 | 0.7364 | 0.74
distilbert-base-uncased | 0.6966 | 0.6966 | 0.6966 | 0.6966 | 0.70
distilbert-base-multilingual-cased | 0.6950 | 0.6950 | 0.6950 | 0.6950 | 0.69
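For context on the System column: the entries are pretrained transformer encoders, and a common way to apply them to this task is to add a token-classification head and fine-tune it on the DIANN annotations. The sketch below only shows that setup with an untrained head; the BIO framing and label count are assumptions, not details reported on the leaderboard.

```python
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

# Assumed framing: BIO tagging with a single "dis" entity type -> 3 labels.
model_name = "roberta-base"  # one of the encoders listed in the table above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name, num_labels=3)

abstract = "Patients with hearing loss were assessed for additional impairments."
inputs = tokenizer(abstract, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits   # shape: (1, num_subword_tokens, 3)
print(logits.argmax(-1))              # per-token label ids (head is untrained here)
```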