The task consists of recognizing disability mentions in abstracts of biomedical articles written in English. It follows the guidelines established in the IberEval 2018 shared task “Disability annotation on documents from the biomedical domain (DIANN)”.
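As an illustration of what the annotation targets, disability mentions are contiguous text spans inside each abstract. The sketch below uses a hypothetical sentence and hand-written labels (not taken from the DIANN corpus) to show two common encodings of such spans, character offsets and token-level BIO tags:

```python
# Hypothetical example of disability-mention annotation (not from the DIANN corpus).
sentence = "Patients with hearing loss and blindness were excluded from the study."

# Span-level representation: (start_offset, end_offset, surface_form)
disability_spans = [
    (14, 26, "hearing loss"),
    (31, 40, "blindness"),
]

# Equivalent token-level BIO encoding (whitespace tokenization, for simplicity).
tokens = sentence.split()
bio_tags = ["O", "O", "B-DIS", "I-DIS", "O", "B-DIS", "O", "O", "O", "O", "O"]

for start, end, text in disability_spans:
    assert sentence[start:end] == text  # offsets are consistent with the sentence
```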
Publication
Hermenegildo Fabregat, Juan Martínez-Romo, and Lourdes Araujo. 2018. Overview of the DIANN task: Disability annotation task. In Proceedings of the Third Workshop on Evaluation of Human Language Technologies for Iberian Languages (IberEval 2018), co-located with the 34th Conference of the Spanish Society for Natural Language Processing (SEPLN 2018), Sevilla, Spain, September 18, 2018, volume 2150 of CEUR Workshop Proceedings, pages 1–14. CEUR-WS.org.
Language
English
Abstract task
Dataset
Year
2023
Publication link
Ranking metric
F1
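The scoring script is not linked on this page; the sketch below shows a common way to compute entity-level precision, recall, and F1 over exact span matches, which is the usual setup for mention-recognition tasks (an assumption here, not something this page specifies):

```python
def entity_f1(gold_spans, pred_spans):
    """Entity-level precision/recall/F1 over exact (start, end, label) matches.

    gold_spans, pred_spans: collections of (start_offset, end_offset, label)
    tuples for one document or for the whole collection.
    """
    gold, pred = set(gold_spans), set(pred_spans)
    tp = len(gold & pred)                          # exact matches only
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1


# Toy usage with hypothetical spans:
gold = {(14, 26, "DIS"), (31, 40, "DIS")}
pred = {(14, 26, "DIS")}
print(entity_f1(gold, pred))  # (1.0, 0.5, 0.666...)
```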
Best results for the task
System | Precision | Recall | F1 | CEM | Accuracy
---|---|---|---|---|---
Roberta large | 0.7982 | 0.7982 | 0.7982 | 0.7982 | 0.80
Xlm roberta large | 0.7740 | 0.7740 | 0.7740 | 0.7740 | 0.77
Ixa ehu ixambert base cased | 0.7695 | 0.7450 | 0.7450 | 0.7695 | 0.75
Roberta base | 0.7612 | 0.7612 | 0.7612 | 0.7612 | 0.76
Xlm roberta base | 0.7438 | 0.7438 | 0.7438 | 0.7438 | 0.74
Bert base multilingual cased | 0.7384 | 0.7384 | 0.7384 | 0.7384 | 0.74
Bert base cased | 0.7364 | 0.7364 | 0.7364 | 0.7364 | 0.74
Distilbert base uncased | 0.6966 | 0.6966 | 0.6966 | 0.6966 | 0.70
Distilbert base multilingual cased | 0.6950 | 0.6950 | 0.6950 | 0.6950 | 0.69
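The systems above are standard pretrained encoders (RoBERTa, XLM-RoBERTa, multilingual BERT, DistilBERT, IXAmBERT), presumably fine-tuned for token classification on this dataset. As a rough sketch of how such a system could be applied to a new abstract, the snippet below uses the Hugging Face transformers token-classification pipeline; the checkpoint path is a placeholder, not a published model:

```python
from transformers import pipeline

# Hypothetical fine-tuned checkpoint; replace with a model actually trained on the DIANN data.
ner = pipeline(
    "token-classification",
    model="path/to/roberta-large-finetuned-diann",
    aggregation_strategy="simple",  # merge subword pieces back into whole mention spans
)

abstract = "Patients with hearing loss and blindness were excluded from the study."
for mention in ner(abstract):
    print(mention["word"], mention["start"], mention["end"], round(float(mention["score"]), 3))
```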