Saifullah Razali, University of Hertfordshire, Singapore
Figurative language, including metaphor, simile, idiom, hyperbole, sarcasm, and irony, encodes meaning that often departs from literal interpretation and is crucial across literature, social media, and educational content. Detecting such language remains challenging for NLP systems because it requires pragmatic, cultural, and world knowledge. This paper presents a thorough study of figurative language detection using pretrained language models (PLMs). We review linguistic foundations, describe architectures and training strategies based on PLMs (BERT, RoBERTa, and GPT-style models), present an experimental framework, and report results drawing on recent benchmark datasets and shared tasks.
Figurative Language, Metaphor Detection, Sarcasm Detection, Pretrained Language Models, BERT, RoBERTa, GPT