Hindi Reading Comprehension: Do Large Language Models Exhibit Semantic Understanding?

Lal, Daisy Monika and Rayson, Paul and El-Haj, Mo (2025) Hindi Reading Comprehension: Do Large Language Models Exhibit Semantic Understanding? In: Proceedings of the First Workshop on Natural Language Processing for Indo-Aryan and Dravidian Languages. Association for Computational Linguistics, Abu Dhabi, pp. 1-10. ISBN 9798891762145

Text (2025.indonlp-1.1)
2025.indonlp-1.1.pdf - Published Version
Available under License Creative Commons Attribution.

Abstract

In this study, we explore the performance of four advanced Generative AI models (GPT-3.5, GPT-4, Llama3, and HindiGPT) on the Hindi reading comprehension task. Using a zero-shot, instruction-based prompting strategy, we assess model responses on the HindiRC dataset through a comprehensive triple evaluation framework. The framework combines (1) automatic evaluation using ROUGE, BLEU, BLEURT, METEOR, and Cosine Similarity; (2) rating-based assessment focusing on correctness, comprehension depth, and informativeness; and (3) preference-based selection to identify the best responses. Human ratings indicate that GPT-4 outperforms the other LLMs on all parameters, followed by HindiGPT, GPT-3.5, and then Llama3. Preference-based evaluation similarly placed GPT-4 (80%) as the best model, followed by HindiGPT (74%). However, automatic evaluation showed GPT-4 to be the lowest performer on n-gram metrics yet the best performer on semantic metrics, suggesting that it captures deeper meaning and semantic alignment rather than direct lexical overlap, which is consistent with its strong human evaluation scores. The study also highlights that although the models answer most literal, factual-recall questions with high precision, they still struggle at times with specificity and interpretive bias.
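
To make the automatic-evaluation step concrete, the sketch below scores model answers against gold answers with one n-gram metric (BLEU via sacrebleu), one overlap metric (ROUGE-L via rouge_score), and an embedding-based cosine similarity, mirroring the mix of lexical and semantic metrics listed in the abstract. It is a minimal illustration only: the libraries, the multilingual encoder name, the whitespace tokenizer, and the list-based data format are assumptions, not the authors' released pipeline.

import sacrebleu
from rouge_score import rouge_scorer
from sentence_transformers import SentenceTransformer, util


class WhitespaceTokenizer:
    """Simple tokenizer so ROUGE works on Devanagari text
    (the default rouge_score tokenizer keeps only Latin characters)."""
    def tokenize(self, text):
        return text.split()


def evaluate_responses(predictions, references):
    """Score hypothesis answers against gold answers with
    n-gram (BLEU, ROUGE-L) and semantic (cosine similarity) metrics."""
    # Corpus-level BLEU over detokenized strings.
    bleu = sacrebleu.corpus_bleu(predictions, [references]).score

    # Sentence-level ROUGE-L F1, averaged over the dataset.
    scorer = rouge_scorer.RougeScorer(["rougeL"], tokenizer=WhitespaceTokenizer())
    rouge_l = sum(
        scorer.score(ref, pred)["rougeL"].fmeasure
        for pred, ref in zip(predictions, references)
    ) / len(predictions)

    # Embedding cosine similarity with a multilingual sentence encoder
    # (the model name is an assumption; any Hindi-capable encoder would do).
    encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
    pred_emb = encoder.encode(predictions, convert_to_tensor=True)
    ref_emb = encoder.encode(references, convert_to_tensor=True)
    cosine = util.cos_sim(pred_emb, ref_emb).diagonal().mean().item()

    return {"bleu": bleu, "rouge_l": rouge_l, "cosine": cosine}


if __name__ == "__main__":
    # Toy Hindi example: a gold answer vs. a model's paraphrased answer.
    refs = ["गंगा नदी हिमालय से निकलती है।"]
    preds = ["गंगा का उद्गम हिमालय में होता है।"]
    print(evaluate_responses(preds, refs))

A paraphrased but correct answer, as in the toy example above, tends to score low on the n-gram metrics while scoring high on cosine similarity, which is the pattern the abstract reports for GPT-4.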

Item Type:
Contribution in Book/Report/Proceedings
ID Code:
227360
Deposited By:
Deposited On:
05 Feb 2025 11:30
Refereed?:
Yes
Published?:
Published
Last Modified:
21 Feb 2025 01:50