TY - JOUR
T1 - Tonirodriguez at CheckThat! 2024
T2 - Working Notes of the 25th Conference and Labs of the Evaluation Forum, CLEF 2024
AU - Rodríguez, Antonio
AU - Golobardes, Elisabet
AU - Suau, Jaume
N1 - Publisher Copyright:
© 2024 Copyright for this paper by its authors.
PY - 2024
Y1 - 2024
N2 - Subjectivity detection is a key task in natural language processing, driven by the challenges posed by new forms of journalism, the proliferation of misinformation and fake news, and ongoing concerns about the quality and integrity of journalism. Although subjectivity detection is a challenge in every language, the resources available for building such applications vary greatly across languages. In this paper, we present our participation in the CLEF 2024 CheckThat! Lab Task 2 [1], where we applied zero-shot cross-lingual transfer techniques using the datasets for the five languages provided in Task 2 (English, German, Italian, Bulgarian, and Arabic). We fine-tuned two multilingual models, mDeBERTa v3 and XLM-RoBERTa, on a subset of the dataset covering three of the Task 2 languages (English, German, and Italian), and applied zero-shot cross-lingual transfer to the remaining two languages, Arabic and Bulgarian.
AB - Subjectivity detection is a key task in natural language processing, driven by the challenges posed by new forms of journalism, the proliferation of misinformation and fake news, and ongoing concerns about the quality and integrity of journalism. Although subjectivity detection is a challenge in every language, the resources available for building such applications vary greatly across languages. In this paper, we present our participation in the CLEF 2024 CheckThat! Lab Task 2 [1], where we applied zero-shot cross-lingual transfer techniques using the datasets for the five languages provided in Task 2 (English, German, Italian, Bulgarian, and Arabic). We fine-tuned two multilingual models, mDeBERTa v3 and XLM-RoBERTa, on a subset of the dataset covering three of the Task 2 languages (English, German, and Italian), and applied zero-shot cross-lingual transfer to the remaining two languages, Arabic and Bulgarian.
KW - Cross-lingual
KW - Fake News
KW - Journalism
KW - Misinformation
KW - Natural Language Processing
KW - Subjectivity Detection
KW - Transfer Learning
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=85201580523&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85201580523
SN - 1613-0073
VL - 3740
SP - 590
EP - 597
JO - CEUR Workshop Proceedings
JF - CEUR Workshop Proceedings
Y2 - 9 September 2024 through 12 September 2024
ER -